WS 2021
  Lecture | Type | SPPS | ECTS-Credits | Course number
  Design of Digital Integrated Circuits | ILV | 4.0 | 6.0 | M-ISCD-1.02
  Testing of Integrated Circuits | ILV | 3.5 | 4.0 | M-ISCD-3.01

SS 2021
  Lecture | Type | SPPS | ECTS-Credits | Course number
  Master Thesis | MT | 0.0 | 24.0 | M-ISCD-4.01
  Master Thesis - Seminar | ILV | 4.0 | 6.0 | M-ISCD-4.02
  Title | Author | Year
  6 Bit Pipeline ADC Design on thin film Transistors Technology for Fingerprint sensor | Tarun Kumar AGARWAL | 2020
  Enabling Fault Injection Verification | Moh'd ABU KHALIFA | 2019
    Run-Time: May/2021 - December/2022
    Project management
  • Dongning Zhao
    Project staff
  • Wolfgang Scherr
  • Manfred Ley
  • Mehdi MORADIAN BOVANLOO
  • Bernd Filipitsch
  • Corinna Maria Kudler
  • Vinayak Hande
    Research focus
  • Sensor technology
    Degree programme
  • Integrated Systems and Circuits Design
    Research programme
  • Research, Development and Innovation, ERDF (EFRE)
    Funding institution / client
  • KWF - Kärntner Wirtschaftsförderungsfonds

    Current robot systems are limited in their ability to interact safely and physically with humans and objects in the real world. Tasks performed by robots are slow, rather simple, and suboptimal, since it is often necessary to isolate robots (e.g. behind fences) to avoid collisions. This makes such robots unsuitable for many application areas in which they are expected to add value. There is a critical socio-economic need to endow robots with the ability to interact safely and effectively with humans and objects in their environment. One reason for the limitations of today's robots is the missing ability to comprehensively perceive the environment in real time, understand situations, predict potential actions, and reason about action consequences and risks. Such abilities are fundamental for interaction with humans and the world, and for task-specific adaptation of actions based on such interaction and experience. To address these topics, sensors for environment perception that are co-designed with dedicated integrated circuits are core elements.

    Consequently, the scientific and technological objective of PATTERN-Skin is to develop a novel embodied, bendable and potentially stretchable, multimodal, modular robot skin that provides robots with unprecedented sensing abilities, facilitating contact-based/tactile and contact-less multimodal exploration of the world towards safe human-robot interaction. Besides the physical realization of the skin modules, physically accurate real-time simulations (“digital twin”) will be developed that make it possible to optimize and tailor skin configurations for specific robots and applications. Based on this sensor skin and the corresponding digital twin, PATTERN-Skin will investigate model-based and AI-based methods to obtain representations of the environment for use in safe control strategies, aiming to meet requirements as defined, e.g., in the ISO 15066 and ISO 10218 safety standards. With respect to the safe, reliable, and secure assembly of full systems from a number of individual sensor skin modules, a unified design pattern utilizing Near Field Communication (NFC) and hardware security elements will be investigated for both wired and wireless connectivity.

    By equipping robots with these enhanced sensing and interaction abilities, PATTERN-Skin is expected to impact a wide range of robotics applications, from personal care and assistance to agile logistics and manufacturing. The developed technologies and methods will be open, modular, and non-proprietary.
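
    As a purely illustrative aid (not part of the project description), the short Python sketch below shows one way the contact-based and contact-less channels of a multimodal skin cell could be reduced to a speed scaling factor in the spirit of speed-and-separation monitoring as in ISO/TS 15066. All names, data fields, and thresholds are assumptions invented for this example.

        # Illustrative sketch only -- names, fields, and thresholds are assumptions,
        # not part of the PATTERN-Skin project description.
        from dataclasses import dataclass
        from typing import List

        @dataclass
        class SkinCellReading:
            """One multimodal sample from a single skin module (hypothetical format)."""
            cell_id: int
            tactile_pressure_kpa: float   # contact-based (tactile) channel
            proximity_distance_m: float   # contact-less channel (e.g. capacitive/ToF)

        def speed_scale(readings: List[SkinCellReading],
                        stop_distance_m: float = 0.05,
                        full_speed_distance_m: float = 0.50) -> float:
            """Map the closest detected object to a speed scaling factor in [0, 1].

            A contact event (non-zero pressure) or an object inside the stop distance
            forces a protective stop; otherwise the allowed speed ramps linearly with
            separation, loosely in the spirit of speed-and-separation monitoring.
            """
            if any(r.tactile_pressure_kpa > 0.0 for r in readings):
                return 0.0  # contact detected: stop immediately
            d_min = min(r.proximity_distance_m for r in readings)
            if d_min <= stop_distance_m:
                return 0.0
            if d_min >= full_speed_distance_m:
                return 1.0
            return (d_min - stop_distance_m) / (full_speed_distance_m - stop_distance_m)

        # Example: the closest cell sees an object 0.2 m away, no contact -> ~0.33 speed.
        print(speed_scale([SkinCellReading(0, 0.0, 0.20), SkinCellReading(1, 0.0, 0.90)]))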

     

    This project is co-financed by the European Regional Development Fund (ERDF).

     

    REACT-EU: Financed as part of the Union's response to the COVID-19 pandemic.

     

    Further information on IWB/EFRE is available at www.efre.gv.at

    Run-Time: January/2019 - June/2022
    Homepage: project website
    Project management
  • Dongning Zhao
    Project staff
  • Mehdi MORADIAN BOVANLOO
  • Ivan SEJC
  • Ram Ratnaker Reddy BODHA
  • David Kwaku Okyere DARKWAH
  • Klaudia LLESHI
    Research focus
  • Sensor technology
    Degree programme
  • Integrated Systems and Circuits Design
    Research programme
  • Regionale Impulsförderung (regional impulse funding)
    Funding institution / client
  • Bundesministerium für Wissenschaft, Forschung und Wirtschaft (BMWFW)

    The demand from industry for a shared human-robot work environment enabling safe human-robot collaboration has increased tremendously in recent years. The most demanding requirement is to ensure the inherent safety of the human in such a work environment and to fulfill the technical specification ISO/TS 15066 for collaborative robots in an industrial context. Current research approaches use vision-based solutions in combination with sensors mounted on the robot manipulator to detect an approaching human. One drawback of these solutions is the occurrence of occlusions (“blind spots”) due to, e.g., robot manipulator movement. In such a situation, the robot needs to switch to an intrinsically safe mode, i.e. it has to reduce the speed of the manipulator, which significantly reduces productivity. Consequently, the major restrictions of currently available perception sensor technology with respect to measurement speed, range, integrability, etc. prevent high motion speeds of collaborative robots.

    A central point of investigation in the project is the development of a novel perception sensor system combining a variety of physical measurement principles (capacitive, ToF, etc.) in order to increase measurement rate, range, accuracy, and resolution for real-time position estimation and motion tracking of a worker in the immediate surroundings of the workplace and robot manipulator. Furthermore, the new perception sensor system is fully integrated into the workplace and the robot manipulator. This key technology enables the development of a Contactless and Safe Interaction Cell (CSIC), in which a human can safely fulfill collaborative tasks jointly with a robot manipulator.

    Parts of the perception sensor are also used for a gesture-based human-robot interface. This allows intuitive interaction of the human with the robot manipulator, which will improve the user experience and increase user acceptance. Acceptance will be further fostered by imitating human-human interaction behavior: the motion planning and control strategy of the robot manipulator will mimic human behavior. The new perception sensor technology will thus substantially increase the operational speed of the robot manipulator in the CSIC, further increasing the productivity of the collaborative human-robot work cell, while ensuring the safety of the human at all times and raising acceptance and user experience through human-like, intuitive interaction and control.
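
    To make the multimodal idea concrete, the following hedged Python sketch shows one simple way a short-range capacitive reading and a longer-range ToF reading could be combined into a single, conservative distance estimate. The validity ranges and the fusion rule are assumptions for illustration only and do not reflect the project's actual algorithms.

        # Minimal sketch, not the project's implementation: the validity ranges and
        # the "closest estimate wins" rule are assumptions made for illustration.
        from dataclasses import dataclass
        from typing import Optional

        @dataclass
        class FusedDistance:
            distance_m: float
            source: str

        def fuse_distance(tof_m: Optional[float],
                          capacitive_m: Optional[float]) -> FusedDistance:
            """Combine a long-range ToF reading with a short-range capacitive reading.

            The capacitive channel is assumed reliable only below ~0.3 m and ToF only
            above ~0.1 m; where both are valid, the smaller (more conservative)
            distance wins so the robot always reacts to the closer estimate.
            """
            candidates = []
            if capacitive_m is not None and capacitive_m < 0.3:
                candidates.append(FusedDistance(capacitive_m, "capacitive"))
            if tof_m is not None and tof_m > 0.1:
                candidates.append(FusedDistance(tof_m, "tof"))
            if not candidates:
                return FusedDistance(float("inf"), "none")  # nothing detected
            return min(candidates, key=lambda c: c.distance_m)

        # Example: ToF reports 0.8 m, capacitive reports 0.25 m -> the closer value wins.
        print(fuse_distance(tof_m=0.8, capacitive_m=0.25))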

     

    Project goals:

    * Development of a modular human-robot work cell (Contactless and Safe Interaction Cell)
    * Real-time perception sensor system
    * Real-time proximity sensor system
    * Capacitive-to-digital converter (CDC) sensor chip (a simplified conversion model is sketched after this list)
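
    The capacitive-to-digital converter named above is not described further in this text. Purely as a hedged illustration of a generic conversion principle, the sketch below models a relaxation-oscillator style front end whose period, counted against a reference clock, yields a digital code proportional to the sensed capacitance. Component values, clock frequency, and resolution are invented for the example and are not project specifications.

        # Illustrative model only -- the project's actual CDC architecture is not
        # described here; this assumes a Schmitt-trigger relaxation oscillator whose
        # period is measured by counting a reference clock.
        import math

        def cdc_counts(c_sense_f: float,
                       r_ohm: float = 1e6,
                       f_ref_hz: float = 48e6,
                       n_bits: int = 16) -> int:
            """Convert a sensed capacitance (in farads) to a digital code.

            A relaxation oscillator built around a Schmitt trigger has a period of
            roughly T = 2*ln(3)*R*C; counting reference-clock cycles over one period
            gives a code that is (ideally) linear in C, saturating at full scale.
            """
            period_s = 2.0 * math.log(3.0) * r_ohm * c_sense_f
            counts = int(period_s * f_ref_hz)
            return min(counts, 2 ** n_bits - 1)

        # Example: 10 pF sense capacitance, 1 MOhm timing resistor, 48 MHz reference
        # clock -> ~1054 counts, i.e. roughly 9.5 fF of capacitance per count.
        print(cdc_counts(10e-12))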

     

    This project is co-financed by the European Regional Development Fund (ERDF / EFRE).
    www.efre.gv.at

    Further information: https://www.efre.gv.at/

    Run-Time: September/2018 - December/2018
    Homepage: project website
    Project management
  • Dongning Zhao
    Project staff
  • Johannes Sturm
  • Ivan SEJC
  • Mehdi MORADIAN BOVANLOO
    Research focus
  • Sensor technology
    Degree programme
  • Integrated Systems and Circuits Design
    Research programme
  • Regionale Impulsförderung (regional impulse funding)
    Funding institution / client
  • KWF - Kärntner Wirtschaftsförderungsfonds

    The demand from industry for a shared human-robot work environment enabling safe human-robot collaboration has increased tremendously in recent years. The most demanding requirement is to ensure the inherent safety of the human in such a work environment and to fulfill the technical specification ISO/TS 15066 for collaborative robots in an industrial context. Current research approaches use vision-based solutions in combination with sensors mounted on the robot manipulator to detect an approaching human. One drawback of these solutions is the occurrence of occlusions (“blind spots”) due to, e.g., robot manipulator movement. In such a situation, the robot needs to switch to an intrinsically safe mode, i.e. it has to reduce the speed of the manipulator, which significantly reduces productivity. Consequently, the major restrictions of currently available perception sensor technology with respect to measurement speed, range, integrability, etc. prevent high motion speeds of collaborative robots.

    A central point of investigation in the project is the development of a novel perception sensor system combining a variety of physical measurement principles (capacitive, ToF, etc.) in order to increase measurement rate, range, accuracy, and resolution for real-time position estimation and motion tracking of a worker in the immediate surroundings of the workplace and robot manipulator. Furthermore, the new perception sensor system is fully integrated into the workplace and the robot manipulator. This key technology enables the development of a Contactless and Safe Interaction Cell (CSIC), in which a human can safely fulfill collaborative tasks jointly with a robot manipulator.

    Parts of the perception sensor are also used for a gesture-based human-robot interface. This allows intuitive interaction of the human with the robot manipulator, which will improve the user experience and increase user acceptance. Acceptance will be further fostered by imitating human-human interaction behavior: the motion planning and control strategy of the robot manipulator will mimic human behavior. The new perception sensor technology will thus substantially increase the operational speed of the robot manipulator in the CSIC, further increasing the productivity of the collaborative human-robot work cell, while ensuring the safety of the human at all times and raising acceptance and user experience through human-like, intuitive interaction and control.

     

    Project goals:

    * Development of a modular human-robot work cell (Contactless and Safe Interaction Cell)
    * Real-time perception sensor system
    * Real-time proximity sensor system
    * Capacitive-to-digital converter (CDC) sensor chip

     

    This project is co-financed by the European Regional Development Fund (ERDF / EFRE).
    www.efre.gv.at

    Further information: https://www.efre.gv.at/

    Please use this link for external references on the profile of Dongning Zhao: www.fh-kaernten.at/mitarbeiter-details?person=d.zhao