
Phantom Geometry: Large networked objects and realtime manipulation.

  • surfer15a
    Message 1 of 11, Sep 30, 2012
      Hey guys!
      I just graduated from SCI-Arc with an M.Arch degree, and I wanted to share my graduate thesis with you. The project was completed with my partner and wife, Liz. You can see it here: vimeo.com/49888105

      This forum has been an amazing resource for us as we pushed our project along. Many people from here contributed to our project, especially Fernando (spaceCaptain) and Robert Cervellione. Fernando and Robert, thanks for all of your help and your patience answering our many questions. We have cited you, and will continue to cite you, where appropriate. Many others contributed as well, whether through inspiring posts, helpful responses to my questions, or helpful notes in a variety of archived threads. Mike's, Ron Light's, and Robert's Kickstarter projects were especially inspiring to us.

      The technology: I don't have to tell you guys that this is obviously DLP/UV resin technology. We used an Epson projector, mounted at the nearest focusable distance (about 400mm), with the color wheel stripped out. The layers (3.5mm thick) cured in about 90-180 seconds. We chose this thickness because it gave the fastest overall build speed we could find with the time and money we had for R&D. The cure time slowed considerably during the project to 500+ seconds (maybe the bulb was dying?). We chose clear resin partly for aesthetic reasons and partly to be able to cure thick layers. We weren't aiming for perfection (clearly!); we were aiming for speed and reliability. We also wanted clear resin as a 'backup' for some other experiments not using DLP, in case the DLP idea wasn't reliable. We found that with clear resin we could cure 1mm of resin about as quickly as we could cure 3.5mm. Thick layers also meant fewer overall layers that we needed to monitor for adhesion. Clearly this low resolution is not desirable--but it was a sacrifice we were willing to make for our short experimental timeframe.
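
      (For anyone doing the math at home: the standard stereolithography working curve, cure depth = Dp * ln(E/Ec), is a reasonable way to think about why a clear, deep-penetrating resin cures 3.5mm almost as quickly as 1mm. And here is a rough build-time sketch with illustrative numbers only--the part height is hypothetical:)

      # Rough build-time estimate for thick-layer DLP (illustrative numbers only).
      layer_thickness_mm = 3.5          # the thickness we settled on
      part_height_mm = 350.0            # hypothetical stalactite segment
      layers = round(part_height_mm / layer_thickness_mm)    # 100 layers

      for cure_s in (90, 180, 500):     # early, late, and "dying bulb" cure times
          print(f"{layers} layers x {cure_s}s/layer = {layers * cure_s / 3600:.1f} h of exposure")
      # At 1mm layers the same height needs 350 layers, so even with a similar
      # per-layer cure time, thick layers cut total exposure time by roughly 3.5x.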

      The resin:
      Fernando was our advisor for the resin. He very generously offered us a great discount on his excellent resin and worked hard to set up delivery of a big quantity for the project. He is very professional and knowledgeable, and I highly recommend involving him in your projects! Sadly, through no one's fault, the shipment was lost by USPS, and at the last minute we had to switch to Bucktown Polymer's ZVOC-500 resin due to our time-crunched school schedule. The BP resin was very, very powerfully smelly, even in small quantities at room temperature. We used a flexible Sylgard vat bottom, as inspired by Mike's experiments that he posted here (and to YouTube). We used one sheet of ~2mm thick silicone for the full ~100 working hours; it was very durable. Fernando also got us in touch with Ron Light (3D Ink), who generously sent us a sample of odorless resin that cured amazingly fast. We wanted to use it! Sadly, because we inquired about it so late in our development, we had to proceed with the other vendor.

      The robot/software interface:
      The robot carrying the DLP/resin continually sends its updated position to a Maya file containing a virtual robot, whose position is therefore continually updated to match. The virtual Maya robot has a virtual camera focused on the plane of the virtual vat bottom (where the light from the projector hits in the physical world). Therefore, anything the virtual robot 'sees' is transmitted directly into the physical world. Any 3D geometry in the virtual Maya lab is therefore 'sliced' and sent to the projector in realtime. In this way, the physical robot always 'knows where it is' in space and 'knows what to print'. You can modify this geometry as you go. This also allows a second robot to come in, meet at exactly the same spot in the lab, and continue working.
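
      If it helps to picture the plumbing, here is a very rough sketch of that loop (not our actual code; the node name, port, and message format are just placeholders):

      # Sketch only: the physical robot streams its tool pose, and a virtual
      # robot/camera in Maya is snapped to the same pose, so the camera's view of
      # the vat-bottom plane always matches where the projector's light lands.
      import json
      import socket
      import maya.cmds as cmds   # runs inside Maya's Python interpreter

      sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
      sock.connect(("127.0.0.1", 5000))   # placeholder robot-bridge address

      def update_virtual_robot(pose):
          # pose = {"t": [x, y, z], "r": [rx, ry, rz]} in the lab coordinate frame
          cmds.xform("virtualRobotTCP", worldSpace=True,
                     translation=pose["t"], rotation=pose["r"])
          # The slicing camera is parented to this transform, so whatever it 'sees'
          # on the vat plane is what gets rendered and sent to the projector.

      buf = b""
      while True:
          buf += sock.recv(4096)
          while b"\n" in buf:
              line, buf = buf.split(b"\n", 1)
              update_virtual_robot(json.loads(line))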

      The workflow:
      The main idea here is that you can build a large, networked object by moving the shallow soft-silicone vat around within the intersecting work spheres of the robots. You can start in one place, allow the object to bifurcate, and then merge with other neighboring stalactites. The second important idea is that the light being transmitted to the DLP is accessible in real time. We can (and did) modify the geometry as we printed. We could modify the 3D geometry in Maya to respond to geometric conditions that evolved as we worked (structural concerns, amount of resin left, aesthetics). We could also modify the 2D image of the sliced 3D geometry in TouchDesigner right before we sent it to the projector--this is pretty neat--it allowed us to control layer thickness on the fly and add perforations. However, any image manipulation you perform has physical consequences. Very cool possibilities for scripting geometry here.
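
      To give a flavor of the 2D manipulation, here is a NumPy/Pillow stand-in for the kind of thing we did in TouchDesigner (the perforation spacing and dimming factor are made-up numbers):

      # Sketch: punch a dot grid out of a white-on-black slice image before projecting
      # it. Black pixels get no light, so the resin there stays liquid (a perforation).
      import numpy as np
      from PIL import Image

      def perforate(slice_img, pitch=40, hole_r=8):
          a = np.array(slice_img.convert("L"), dtype=np.uint8)
          h, w = a.shape
          yy, xx = np.mgrid[0:h, 0:w]
          mask = np.zeros((h, w), dtype=bool)
          for cy in range(pitch // 2, h, pitch):
              for cx in range(pitch // 2, w, pitch):
                  mask |= (yy - cy) ** 2 + (xx - cx) ** 2 < hole_r ** 2
          a[mask] = 0
          return Image.fromarray(a)

      def dim(slice_img, factor=0.6):
          # Dimming the frame (globally or per-region) is one knob for effective
          # layer thickness: less light for the same exposure means a thinner cure.
          a = np.array(slice_img.convert("L"), dtype=np.float32) * factor
          return Image.fromarray(a.astype(np.uint8))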

      Software:
      We used the esperant.0 robotics software developed by Kruysman-Proto, and cobbled together some Python scripts to get everything going.

      Sorry for the huge post!
      I'll try to answer any questions. And, thanks again!!

      kyle (surfer15a)
    • Lino
      Message 2 of 11, Sep 30, 2012

        Holy crap! That’s amazing!

      • Paolo Velcich
        Message 3 of 11, Sep 30, 2012

          WOW ! ! ! ! !

          I'm speechless... That's really cool, one of the coolest performances I've ever seen.

          That's true CYBER ARCHITECTURE.

          Clap, clap, clap... I have no words, just admiration for something really sublime.

          And, BTW, I actually appreciate the rough resolution; those stepped edges make it so "digital" that a smooth surface wouldn't be the same.

          Congratulations again.

          Paolo

        • cerverds
          Message 4 of 11, Sep 30, 2012
            It looks great; I am glad to see you got it all worked out. The process looks messy and grotesque, so Hernan must have loved it! I don't think I deserve any credit though; I was happy to share whatever info I had. I can't wait to post pictures of the 'big one' that I've got going. I just got back from Maker Faire today, and I was thinking that the architecture community's presence is missing there, given how much fabrication and 3D printing research we do. I would love to see this in one of the tents next year (if you can borrow the robots).

            -Robert
          • Spacecaptain
            Message 5 of 11, Oct 1, 2012
              What a beautiful project!!
              It was absolutely worthwhile participating, and I'd do it all over again (but I'd use a courier service as my first option this time :) )


            • arthur2shedsj
              Message 6 of 11, Oct 1, 2012
                Great job!

                Also nice to see that you only trusted Stäubli for your robots. It's the only brand we use and recommend for precision applications.

                Bucktown Polymer's Zero VOC ZVE500-V420 would have been a better fit for the application. It's just as inexpensive and incredibly strong, but it cures considerably faster, especially in the 380-440nm range. But I'm glad the UV version worked out for you.
              • m3dp_vt
                Message 7 of 11, Oct 28, 2012
                  This is amazing!

                  I am attempting to build a 3D printer using a DLP projector with UV light, on a much smaller (micro) scale. I was wondering if you had a list of your components (projector, UV light source, etc.)?

                  Thanks

                • luis h
                  Message 8 of 11, Mar 1, 2013
                    Hi Kyle,

                    I just saw your post from last year. Amazing job, as everybody has mentioned here. I just finished my 3D slicer, fully loaded with material support and other features, and I am now focusing on the first prototype. I want to be able to put my printer machines on the market next year. I would like to know whether you used a PDMS vat, because I can see some blinking of the projector (is it continuous printing?), and also what is the Ron Light ink you are referring to?

                    Nice job,

                    Thank you,

                    Luis



                  • Light77
                    Message 9 of 11, Mar 2, 2013
                      This is the resin Kyle was referring to: http://buy3dink.com/p/59/uv-resin (mixed and sold in the US).
                      Fernando formulated this to be extra thin for use in top-down printers like this one: http://sedgwick3d.com/
                      -Ron


                    • Fernando Muñiz
                      Message 10 of 11, Mar 2, 2013
                        Yes, I recall working with Kyle on that project. I enjoyed it a lot; it was very inspiring, with quite a different set of requirements from the typical printer.
                        I love the media and videos they did for that project!

                      • surfer15a
                        Message 11 of 11, Mar 6, 2013
                          Hi Luis,

                          Fernando makes excellent resins and is an excellent source of info during the development of a resin/project. He licenses his formulas to Ron (both Ron and Fernando replied above) in the US. I highly recommend working with one or both of them. Their products are fast-curing and have almost no odor, which is a big improvement over many competitors. Regarding the PDMS: yes, the bottom of the vat is a flexible 4mm-thick sheet of PDMS. Each image/layer is cured, then the machine jogs down and back up before curing the next layer, which allows the flexible PDMS to peel off the cured layer. This is shown (not very clearly) in a few shots, but for the time-lapse sequences we carefully edited out the down/up jogging because we thought it looked nicer and less jerky.
                          So, it's not continuous printing--does that answer your question?
                          The blinking you see happens during the up/down jog between layers: the projector is hooked up in real time and keeps cycling through slices as the machine moves. We could have blanked the projector during that layer change, but we liked the pretty blinking images ;)
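
                          If it helps, here's the general shape of that cure/peel loop as a rough Python sketch. This is not our actual code (our real scripts drove the robot, Maya, and TouchDesigner); the function names (show_slice, blank, move_z) and the numbers are just placeholders to show the order of operations:

                          import time

                          LAYER_THICKNESS_MM = 3.5   # the thick layers we used
                          CURE_TIME_S = 120          # ours were roughly 90-180 s (and slower later on)
                          PEEL_DIP_MM = 10.0         # hypothetical jog distance for the peel move

                          def show_slice(projector, image):
                              """Send the current slice image to the projector (stub)."""

                          def blank(projector):
                              """Project a black frame so nothing cures while moving (stub)."""

                          def move_z(robot, dz_mm):
                              """Jog the machine along Z by dz_mm (stub)."""

                          def print_layers(robot, projector, slices):
                              for i, image in enumerate(slices):
                                  # 1) cure the current layer
                                  show_slice(projector, image)
                                  time.sleep(CURE_TIME_S)

                                  # 2) blank the projector for the move -- we skipped this,
                                  #    which is why you see the blinking in the video
                                  blank(projector)

                                  # 3) the "jog down and back up" so the flexible PDMS peels off,
                                  #    then step by one layer thickness (sign/offset is machine-specific)
                                  move_z(robot, -PEEL_DIP_MM)
                                  move_z(robot, PEEL_DIP_MM)
                                  move_z(robot, -LAYER_THICKNESS_MM)

                                  print("finished layer %d of %d" % (i + 1, len(slices)))

                          In a real setup you'd replace the stubs with whatever your projector output and motion control actually expose.
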
                          I really would have liked to try one of the super-hydrophobic chemicals that are coming to market; in my opinion they seem to be the best hope for true continuous printing, but I'm far from an expert.
                          Good luck with your project.
                          kyle

                          --- In diy_3d_printing_and_fabrication@yahoogroups.com, Fernando Muñiz <spacecaptain@...> wrote:
                          >
                          > Yes, I recall working with Kyle on that project. I enjoyed it a lot;
                          > it was also very inspiring, with quite a different set of requirements
                          > from the typical printer.
                          > I love the media and videos they did for that project!
                          >
                          > On 03/02/2013 03:46 PM, Light77 wrote:
                          > > This is the resin Kyle was referring to:
                          > > http://buy3dink.com/p/59/uv-resin (mixed and sold in the US).
                          > > Fernando formulated it to be extra thin for use in top-down printers
                          > > like this one: http://sedgwick3d.com/
                          > > -Ron
                          > > On Fri, Mar 1, 2013 at 1:42 PM, luis h <luisguillermo98@...> wrote:
                          > >
                          > >
                          > > Hi Kyle,
                          > >
                          > > I just saw your post from last year. Amazing job, like everybody
                          > > has mentioned here. I just finished my 3D slicer, fully loaded with
                          > > material support and other features, and am now focusing on the
                          > > first prototype. I want to be able to put my printer on the market
                          > > next year. I would like to know whether you used a PDMS vat, because
                          > > I can't see any blinking of the projector (continuous printing?),
                          > > and also what is the Ron Light ink you are referring to?
                          > >
                          > > Nice job,
                          > >
                          > > Thank you,
                          > >
                          > > Luis
                          > >
                          > >
                          > >
                          > > --- In diy_3d_printing_and_fabrication@yahoogroups.com, "surfer15a" wrote:
                          > > > [original post snipped -- see message 1 above]