Drawing robots are not uncommon these days, but TsunComp offers a unique feature: the ability to create custom drawings from direct spoken requests. Our pipeline is a fully automated speech-to-machine-code process built around GPT-3.5 and Stable Diffusion. It sends the generated data to a custom-built plotter that draws the image for you with a pen, leaving you with a physical keepsake you can hold on to.
2023
University module
Andreas Kohler
Elia Salerno
David Polke
Stepan Vedunov
Industrial Design
GUI
Our primary motivation for the project emerged from the desire to explore novel methods of interacting with LLMs. Consequently, we prioritized research into user input and interaction, defining the project's central objective as simulating AI sentience through natural, human-like interaction. To get an overview of the scope and divide the tasks evenly, we developed a pipeline visualization that guided us throughout the process.
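For illustration, the sketch below shows how the pipeline stages could be chained in Python. It is a minimal sketch, assuming the openai and diffusers libraries; transcribe() and image_to_gcode() are hypothetical placeholders for the speech-to-text and plotter stages, whose actual implementations are not detailed here.

```python
# Minimal sketch of the speech-to-plotter pipeline (illustrative, not the
# project's actual code). Assumes openai>=1.0 and diffusers are installed.
import torch
from openai import OpenAI
from diffusers import StableDiffusionPipeline

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment
sd_pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

def transcribe(audio_path: str) -> str:
    """Hypothetical speech-to-text stage (e.g. a Whisper call)."""
    raise NotImplementedError

def refine_prompt(transcript: str) -> str:
    """GPT-3.5 turns the spoken request into a concise drawing prompt."""
    reply = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system",
             "content": "Rewrite the user's request as a short prompt "
                        "for a line-art image generator."},
            {"role": "user", "content": transcript},
        ],
    )
    return reply.choices[0].message.content

def image_to_gcode(image) -> str:
    """Hypothetical vectorization stage: trace the image into pen strokes."""
    raise NotImplementedError

def draw_request(audio_path: str) -> str:
    """Speech in, plotter machine code out."""
    prompt = refine_prompt(transcribe(audio_path))
    image = sd_pipe(prompt).images[0]
    return image_to_gcode(image)
```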
To create a robot that users would find inviting to interact with, we dedicated significant effort to the physical design of TsunComp. Our goal was to strike a balance between a recognizable robotic appearance and a human-like feel, so our design approach centered on rounded shapes and familiar materials. With that same approachability in mind, we developed a straightforward GUI that accepts speech input, resembling voice messages in social media apps.
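A minimal push-to-talk sketch of that voice-message interaction is shown below. It is an illustration only: tkinter and the sounddevice library stand in for whatever toolkit the actual GUI used, which is not documented here.

```python
# Illustrative push-to-talk input: hold the button to record a request,
# release to save it for the pipeline. Assumes sounddevice and scipy.
import tkinter as tk
import sounddevice as sd
from scipy.io.wavfile import write

SAMPLE_RATE = 16_000
recording = None

def start_recording(_event):
    global recording
    # Record up to 30 s; cut short when the button is released.
    recording = sd.rec(30 * SAMPLE_RATE, samplerate=SAMPLE_RATE, channels=1)

def stop_recording(_event):
    sd.stop()
    write("request.wav", SAMPLE_RATE, recording)  # hand off to the pipeline

root = tk.Tk()
button = tk.Button(root, text="Hold to talk")
button.bind("<ButtonPress-1>", start_recording)
button.bind("<ButtonRelease-1>", stop_recording)
button.pack(padx=40, pady=40)
root.mainloop()
```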
Our final product serves as a proof of concept for pseudo-sentience in AI. We achieve this by incorporating human-like features into the design, interaction, and behavior of the robot. This integration deepens our understanding of the field of HRI and offers a foundation for creating more compelling experiences in the future.
Throughout the module, our team stayed eager and motivated to do our best, and we were satisfied with our exploration of AI, from conceptual theory to its application in robotics. Our main obstacle was inefficient time management toward the end of the two-week module: we struggled particularly with the final stages of the software pipeline development and its integration into the physical robot.