25 September - 27 September | 18:00 - 22:00
Exhibition #2: HUMAN ♥ ROBOT
Aravinth Panchadcharam, GESTURE RECOGNITION FOR HUMAN-ROBOT-INTERACTION
Human-robot interaction (HRI) has been a topic of science fiction and academic speculation since before any robots existed. HRI research focuses on building intuitive, natural communication with robots through speech, gestures and facial expressions. Hand gestures offer a better solution than conventional human-machine interfaces, and translating them helps achieve the ease and naturalness desired for HRI. This has motivated very active research into computer vision-based analysis and interpretation of hand gestures.
In this project, he implemented hand gesture recognition for the humanoid robot Aldebaran NAO by modelling, training, classifying and recognising gestures using computer vision algorithms and machine learning techniques such as an adaptive naive Bayesian classifier. Gestures are modelled on skeletal points, and the features are extracted with the NiTE framework from an Asus Xtion depth camera. As a result, gestures can on the one hand command the robot to execute certain actions, and on the other hand be translated and spoken aloud by the robot.
The project was implemented at DAI-Labor with the help of Dr. Yuan Xu, a passionate roboticist. https://github.com/AravinthPanch/gesture-recognition-for-human-robot-interaction
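The classification step described above can be sketched roughly as follows. This is a minimal, plain (non-adaptive) Gaussian naive Bayes, and the feature names, gesture labels and training values are invented for illustration; the actual system extracts skeletal features with NiTE and a depth camera.

```python
import math
from collections import defaultdict

class GaussianNaiveBayes:
    """Minimal Gaussian naive Bayes over fixed-length feature vectors."""

    def fit(self, X, y):
        grouped = defaultdict(list)
        for features, label in zip(X, y):
            grouped[label].append(features)
        self.stats = {}
        for label, rows in grouped.items():
            cols = list(zip(*rows))
            means = [sum(c) / len(c) for c in cols]
            # Variance floor avoids division by zero for constant features.
            variances = [
                max(sum((v - m) ** 2 for v in c) / len(c), 1e-9)
                for c, m in zip(cols, means)
            ]
            self.stats[label] = (means, variances, len(rows) / len(X))
        return self

    def predict(self, features):
        # Pick the label with the highest log-posterior.
        best_label, best_score = None, float("-inf")
        for label, (means, variances, prior) in self.stats.items():
            score = math.log(prior)
            for x, m, v in zip(features, means, variances):
                score += -0.5 * math.log(2 * math.pi * v) - (x - m) ** 2 / (2 * v)
            if score > best_score:
                best_label, best_score = label, score
        return best_label

# Toy training data (hypothetical): 2-D features per frame, e.g. normalised
# hand height and hand-to-shoulder distance.
X = [[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.2, 0.8]]
y = ["wave", "wave", "point", "point"]

clf = GaussianNaiveBayes().fit(X, y)
print(clf.predict([0.85, 0.15]))  # → wave
```

An adaptive variant, as named in the project description, would additionally update the per-class means and variances online as new labelled frames arrive.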
DAInamite, Team of the RoboCup Standard Platform League (SPL)
The DAInamite team first participated in RoboCup SPL in 2013 in Eindhoven, Netherlands, with its five NAO V4 H2 robots. The team consists of Bachelor's and Master's students from the Technical University Berlin, backed by researchers at the Distributed Artificial Intelligence (DAI) Laboratory.
Prof. Dr. Dr. h.c. Sahin Albayrak is the founder and head of the DAI-Lab. The DAI-Lab performs applied research and development of new systems and services, applying and testing the solutions in real environments to make them tangible for users. Current application fields are electromobility, smart grid, home, health, ambient assisted living, security, and service robotics.
Ingo Randolf, URBAN NEEDLE
Urban Needle scans its environment in an automated process, transforming the shape of the room into sound. Like a record needle it plays back a room, producing room-based noise. The room is the reel. Urban Needle sonifies architectures and urban spaces. http://ingorandolf.info/
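The record-needle idea can be illustrated with a small sketch. Everything here is an assumption for illustration, not the artist's actual implementation: a ring of measured wall distances (one reading per degree) is normalised and looped as a raw audio waveform, so the room's outline literally becomes the groove.

```python
import math

def room_to_waveform(distances_m, seconds=1.0, sample_rate=8000):
    """Loop the normalised distance profile as audio samples in [-1, 1]."""
    lo, hi = min(distances_m), max(distances_m)
    span = (hi - lo) or 1.0  # avoid division by zero for a perfect circle
    profile = [2 * (d - lo) / span - 1 for d in distances_m]
    n = int(seconds * sample_rate)
    return [profile[i % len(profile)] for i in range(n)]

# Hypothetical scan of a square room, 4 m across: distance to the wall
# at each of 360 angles from the centre.
scan = [2.0 / max(abs(math.cos(math.radians(a))), abs(math.sin(math.radians(a))))
        for a in range(360)]
samples = room_to_waveform(scan)
print(len(samples))  # → 8000
```

A square room would produce a buzzy periodic tone; a more irregular space would yield a noisier timbre, which matches the piece's description of room-based noise.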
So Kanno, DIASPORA LETTERS
This is an asemic writing project that uses machine learning to abstract handwritten letters: an artificial intelligence (A.I.) extracts shapes and patterns from strokes while ignoring their meaning. Machine learning programming by Hironori Sakamoto.
Humans can recognise which language is being spoken without understanding its meaning when they overhear a conversation on the street, provided they have heard the language before. This means humans learn the sound of a language separately from its meaning. Can the same happen with letters? http://kanno.so/
Thomas O’Reilly, CHARLOTTE
Charlotte is an interactive drawing robot that moves by sensing and reacting to changes of light. It is a tool created for those curious about new ways of exploring their environment. Charlotte can be easily replicated through an open-source maker kit designed to demystify the 'black box' by revealing insights into electronics and code through its assembly. http://www.toreilly.com/
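Light-reactive movement of this kind is often built on a Braitenberg-style sensor-to-wheel mapping. The sketch below is a speculative illustration under that assumption, not Charlotte's actual code; the sensor readings and cross-wired coupling are invented.

```python
def wheel_speeds(left_light, right_light, gain=1.0):
    """Cross-wired drive: each wheel is driven by the opposite light sensor,
    so the robot turns towards the brighter side."""
    return gain * right_light, gain * left_light

# Brighter light on the right: the left wheel spins faster, steering right.
left, right = wheel_speeds(0.2, 0.8)
print(left, right)  # → 0.8 0.2
```

With equal light on both sensors the robot drives straight, so the drawing it leaves behind traces the light gradients of the room.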
Aravinth Panchadcharam was introduced to computers in 1992 by his father, an electrical engineer and architect. His interest in digital technologies pushed him to develop his artistic skills, and he gained an Associate's Degree in Graphics and Animation as a teenager. Since then he has worked as a freelancer on interactive portfolios, graphic design, illustration and animation for various clients. In 2004 he went on to pursue a Bachelor's degree in Electronics and Communication Engineering, specialising in electronics, analog and digital communication technologies, microcontrollers, embedded programming, digital signal processing and robotics. As a result, he built a 2-DOF robot that can be controlled via GSM from anywhere in the world. In 2011 he moved to Berlin to pursue a Master's degree in Electrical Engineering, deepening his knowledge of research methodologies through projects in autonomous vehicles, vehicle-to-vehicle communication, biometrics, the Internet of Things, wireless mesh networks, robotics, computer vision, artificial intelligence and machine learning.
Ingo Randolf was born in 1977 in Salzburg, Austria. He finished his studies in audiovisual media at the Art University Linz in 2005. He co-founded the VJ group bildstrom in 1998 and realised various projects (short films, installations, video performances) as an artist in Linz before moving to Göteborg, Sweden, in 2010 and to Berlin in 2012. His artistic practice often involves some kind of technology. He likes to build stuff.
Thomas O’Reilly is a phenomenologist interested in creating products and experiences that make people more human. After studying Product Design at Nottingham Trent University he moved to Berlin to explore new ways of connecting people with technology.
So Kanno graduated from the Design Informatics program at Musashino Art University and completed studies at the Institute of Advanced Media Arts and Sciences. He works with technology, focusing on specific technological matters such as the relation between signal and noise, error, and glitch, making things that he wants to see and observe.