Abstract. One of the aims of social robots is to make life easier for people, helping them do things more comfortably and assisting them in some tasks. With this purpose in mind, and taking advantage of the wide use of infrared-controlled home appliances, a system for controlling infrared devices on board a robot has been developed. We focus on standard appliances with an infrared interface, so no adjustments are needed either in the appliances or in the environment. The system has been implemented on the social robot Maggie, which has been equipped with an infrared receiver/transmitter. After the desired commands have been taught to the robot, Maggie can govern devices placed at several locations. First, a dialog between Maggie and the user is established. Once the robot has understood what the user wants, it moves toward the device and sends the required command. To test the smooth running of the system, several trials have been carried out in our lab with a TV screen.
1 Introduction
Nowadays there are many devices at home operated by an infrared remote control: televisions, air conditioning units, VCRs, heating systems, etc. In addition, the expected widespread presence of robots at home will open up a new world of opportunities, so in the near future new household robots communicating with home appliances will appear. Moreover, since remote controls have become more and more complicated, we have tried to devise an easier and more intuitive way of commanding them. We have especially oriented our research toward children and elderly people.
In this paper, a voice-operated remote control built into a personal robot is presented. The infrared commands required for certain tasks are easily taught from the original remote control and, subsequently, the commands are sent by the robot upon request of the user.
One of the biggest advantages of this work is that the robot interacts with regular appliances, that is, without any change or adjustment to them. Therefore, highly specialized and expensive devices capable of being integrated with domotic controllers are not necessary, and money is saved.
We implement the system on a mobile robot, so it is possible to govern appliances at several locations. The robot moves to the area where the device to be controlled is located and carries out the entrusted task. In case something goes wrong (e.g. it cannot reach the location), the robot notifies the user. For example, a person in one room can order the robot to turn on the heating system located in a different room; the robot will move to the desired room and execute the intended command.
This work has been developed and tested in our lab with a TV screen.
The rest of the paper is organized as follows. In the remainder of Section 1, previous works related to this one are presented and compared, and the pursued goals are stated. Section 2 describes the robot Maggie and the AD architecture in which this work is framed. Section 3 presents the proposed solution. The integration in AD is shown in Section 4. Section 5 describes the trials and how the system has been tested. Finally, some concluding remarks end the paper in Section 6.
1.1 Related Work

There have been several previous works related to this one. In [1], a system for controlling a computer by means of a pair of glasses equipped with infrared signals is developed. In that case, complex hardware has to be attached to the computer in order to communicate with it, and the user needs to wear the infrared glasses. In [2], the authors propose a platform for controlling appliances. The system has a robot with an infrared receiver/transmitter, and it is commanded through MSN. In this case, the user needs a computer to send orders to the robot. The idea in [3] is to develop specific hardware that enables conventional appliances without telecommunication capabilities to connect to home networks so they can be remotely controlled. The weak point is that new hardware has to be installed in each device. Another option is to install networked appliances to create a home network [4], which makes the control task easier but implies an important increase in budget.
There are other means of communication besides infrared. For example, [5] uses Bluetooth as the communication protocol to connect several home appliances, which have to be provided with this interface.
Thinking of robots, some home, social, or personal robots are able to interact with people and home appliances. Toshiba Corporation has developed a concept model of a robotic information home appliance called ApriAlpha [6] [7]. ApriAlpha is a wheeled, human-friendly home robot which controls advanced home appliances, standing between them and their users as a voice-controlled information terminal. A newer version of ApriAlpha, ApriPoko, is basically a voice-operated infrared universal remote control which learns commands; it is connected to a laptop that processes all the data, and for now it is just an R&D demonstrator device. Fujitsu Laboratories has developed an internet-enabled home robot called MARON-1 (Mobile Agent Robot Of Next-generation) [8] as a practicality-oriented household robot. MARON-1 can be remotely controlled from a mobile phone to monitor home security and operate household appliances that respond to infrared remote control signals. All these robots are shown in Fig. 1.
1.2 Goals
Our target is to integrate an infrared device into the robot Maggie and to incorporate the necessary software into its control architecture. The robot has to be able to operate unchanged infrared-controlled gadgets by means of natural commands in human terms. In other words, the user has to be able to interact with the robot just as with another person. The gadgets can be situated in different locations, so the system must be capable of reaching and facing them. The module has been built into the experimental platform Maggie. An easy, friendly interaction is required because children, elderly people, and people with disabilities will be the first potential users (Fig. 2).
Fig. 2. Operating appliances from the Social Robot Maggie
2 Frame of the work
The work developed in this paper has been implemented on the research robotic platform Maggie. Maggie is a personal robot intended for investigating human-robot interaction and for improving robot autonomy. It was conceived for personal assistance: making life easier at home, helping handicapped people, keeping people company, etc. Its friendly external appearance facilitates its task as a social robot.
2.1 Automatic-Deliberative Architecture
Maggie's software is based on the two-level Automatic-Deliberative (AD) architecture [9] [10]. The automatic level is linked to modules communicating with hardware, sensors, and motors. Reasoning processes are placed at the deliberative level. The communication between both levels is bidirectional and is carried out by means of the Short Term Memory and Events [11].
Events are the mechanism used by the architecture for working in a cooperative way. An event is an asynchronous signal that coordinates the processes emitting and capturing it. The design follows the publisher/subscriber design pattern, so a skill generating events does not know whether these events are received and processed by other skills or not.
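As an illustration of this mechanism, the following minimal Python sketch shows a publisher/subscriber event bus. The names (EventBus, subscribe, emit) and the REC_OK example are ours; they do not correspond to the actual AD implementation.

from collections import defaultdict


class EventBus:
    """Routes asynchronous events from emitting skills to subscribed skills."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, event_name, handler):
        # A skill registers its interest in one type of event.
        self._subscribers[event_name].append(handler)

    def emit(self, event_name, payload=None):
        # The emitting skill does not know who, if anyone, processes the event.
        for handler in self._subscribers.get(event_name, []):
            handler(payload)


# Example: the ASR skill emits REC_OK and any subscribed skill reacts to it.
bus = EventBus()
bus.subscribe("REC_OK", lambda rule_id: print("recognized grammar rule:", rule_id))
bus.emit("REC_OK", "turn_on_digital_tv")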
The Short Term Memory is a memory area that can be accessed by different processes and where the most important data are stored. Different data types can be distributed, and the data are available to all elements of the AD architecture. The current value, the previous value, and the date of capture are stored for each item. Therefore, when a new value is written, the previous one is not eliminated; it is kept as the previous version. The Short Term Memory allows data structures to be registered and removed and particular data to be read and written, and several skills can share the same data.
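The following minimal sketch illustrates this behavior, assuming a simple in-process store; the class and method names are illustrative, not those of the AD implementation.

import time


class ShortTermMemory:
    """Shared data area keeping, for every item, its current value,
    its previous value, and the time of capture."""

    def __init__(self):
        self._items = {}

    def register(self, name):
        self._items[name] = {"current": None, "previous": None, "timestamp": None}

    def unregister(self, name):
        del self._items[name]

    def write(self, name, value):
        item = self._items[name]
        # Writing does not discard the old value; it is kept as the previous version.
        item["previous"] = item["current"]
        item["current"] = value
        item["timestamp"] = time.time()

    def read(self, name):
        return self._items[name]["current"]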
The essential component in the AD architecture is the skill [11], and skills are located at both levels. A skill is the capacity to reason, process data, or carry out actions. In terms of software engineering, a skill is a class hiding the data and processes that describe the global behavior of a robot task or action. The core of a skill is its control loop, which may be running (the skill is activated) or not.
Skills can be activated by other skills or by a sequencer, and they can give back data or events to the activating element or to other skills interested in them. Skills are characterized as follows (a minimal sketch of a skill is given after the list):
- They have three states: ready (just instantiated), activated (running the control loop), and locked (not running the control loop).
- They have three ways of working: continuous, periodic, and event-driven.
- Each skill is a process. Communication among processes is achieved through the Short Term Memory and events.
- A skill represents one or more tasks, or a combination of several skills.
- Each skill has to be subscribed to at least one event, and it has to define its behavior when that event arises.
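Under the assumptions above, and reusing the illustrative EventBus and ShortTermMemory classes from the earlier sketches, a skill could be outlined as follows; the state and method names are ours, not those of the real AD classes.

import threading
import time
from enum import Enum


class SkillState(Enum):
    READY = "ready"          # just instantiated
    ACTIVATED = "activated"  # control loop running
    LOCKED = "locked"        # control loop not running


class Skill:
    def __init__(self, bus, memory):
        self.bus = bus        # illustrative EventBus from the sketch above
        self.memory = memory  # illustrative ShortTermMemory from the sketch above
        self.state = SkillState.READY
        self._thread = None

    def subscribe(self, event_name):
        # Every skill subscribes to at least one event and reacts in on_event.
        self.bus.subscribe(event_name, self.on_event)

    def on_event(self, payload):
        raise NotImplementedError  # each concrete skill defines its reaction

    def activate(self):
        # Start the control loop (each skill is a process in the real
        # architecture; a thread keeps this sketch self-contained).
        self.state = SkillState.ACTIVATED
        self._thread = threading.Thread(target=self._control_loop, daemon=True)
        self._thread.start()

    def lock(self):
        self.state = SkillState.LOCKED  # the control loop stops

    def _control_loop(self):
        while self.state is SkillState.ACTIVATED:
            self.step()
            time.sleep(0.1)  # periodic working mode, as an example

    def step(self):
        pass  # task-specific behavior of the concrete skill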
3 Proposed Solution

A general overview of the system is shown in Fig. 3, which displays how the system works from start (the user interacting with the robot) to finish (the robot sending the command). As pointed out there, the proposed solution consists of two interfaces (a human-robot interface and a robot-appliance interface) plus the module in charge of moving the robot close to the proper appliance.
Fig. 3. Overall flow diagram of the whole system
3.1 Human-Robot Interface
There are several ways of communicating with the robot, but we focus on natural communication in human terms. When the user wants Maggie to turn on the air conditioning, he has to transmit his intention to the robot as if he were interacting with another person. Communication with the robot can thus be accomplished in different ways.
A natural and approachable interaction between the user and the robot is needed, and verbal communication meets these requirements. Consequently, in this experiment, speech forms the human-robot interface. The user speaks to the robot, and Maggie is able to understand speech thanks to grammar-based speech recognition software. We link one grammar rule to each infrared command Maggie will execute. The speech recognition system is modeled as a permanent skill, i.e., a skill which is activated once and remains active [12].
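As an illustration, the following hypothetical table links grammar rule identifiers to the device and infrared command they trigger. The actual grammar format of the ASR software is not detailed here, so all identifiers are assumptions.

# Hypothetical link between recognized grammar rule identifiers and the
# device/command pair Maggie has to emit; all identifiers are illustrative.
GRAMMAR_TO_IR_COMMAND = {
    "turn_on_digital_tv":    {"device": "tv", "command": "power_digital_tv"},
    "turn_on_digital_radio": {"device": "tv", "command": "power_digital_radio"},
    "channel_up":            {"device": "tv", "command": "channel_up"},
    "channel_down":          {"device": "tv", "command": "channel_down"},
    "volume_up":             {"device": "tv", "command": "volume_up"},
    "volume_down":           {"device": "tv", "command": "volume_down"},
}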
3.2 Robot-Appliance Interface
Here the main part of our work is presented. In order to send commands to infrared-operated appliances, Maggie is equipped with the IRTrans USB infrared control system [13]. It has been chosen because of its USB interface and Linux compatibility. It has been placed inside Maggie's body, behind a sphere which lets the infrared signal pass through (Fig. 4).
The chosen hardware comes with all the software required to work in a Linux environment: a TCP/IP server that accesses the hardware directly and replies to client requests, a test client, and the libraries needed to program our own software.
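A minimal client sketch for asking the server to emit a previously learned command is shown below. The port number (21000) and the ASCII command syntax ("snd <remote>,<command>") follow the vendor documentation as we recall it and should be treated as assumptions rather than verified protocol details.

import socket


def send_ir_command(remote, command, host="localhost", port=21000):
    """Ask the IRTrans server to emit a command learned for a given remote.
    Port and ASCII syntax are assumptions based on the vendor documentation."""
    with socket.create_connection((host, port), timeout=2.0) as sock:
        sock.sendall("snd {},{}\n".format(remote, command).encode("ascii"))
        return sock.recv(256).decode("ascii", errors="replace")


# Example: replay the "channel_up" command learned for the remote named "tv".
# print(send_ir_command("tv", "channel_up"))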
Fig. 4. USB infrared device placed inside Maggie
Because of the nature of infrared technology, it is essential that the robot is located in front of, and facing, the appliance it communicates with. Hence, a suitable navigation system is fundamental.
4 Integration in AD architecture
This section explains how the entire infrared system operates within the robot architecture. To accomplish our work, various skills have been used, which are connected as shown in Fig. 5. The skills involved are:
- ASR Skill: the Automatic Speech Recognition skill is in charge of reporting which grammar rule has been identified through the microphones. An event (REC_OK), together with the identifier of the detected grammar rule, is sent to alert the rest of the architecture. This event is caught by every skill subscribed to it, in our case a skill named Speech IR Control.
- Speech IR Control Skill: this is a data-processing skill which translates an incoming event from the ASR skill into a new event, based on the identified grammar rule, notifying the type of command. If the command is not related to the infrared system, the event is ignored. Otherwise, the required information, i.e., the device to be controlled and the order to be sent to it (for example "turn on the TV"), is stored in the Short Term Memory. Then, the Speech IR Control skill indicates, by means of the GOTO event, that Maggie has to move to the location where the device is. If the position is reached, the GOTO_OK event is received, the robot is ready to emit the appropriate command, and CONTROL_IR is sent. If not, the operation is aborted. A minimal sketch of this event translation is given after this list.
- GoTo Skill: after the GoTo skill receives the GOTO event, it tries to move the robot to the position determined by the data in the Short Term Memory. In our case, this skill takes the name of the device to be operated from the Short Term Memory and relates it to a pose (position and orientation) in an internal map of the world. If the desired position is reached, the GOTO_OK event is sent; otherwise, GOTO_FAIL is sent.
- IR Remote Control Skill: the CONTROL_IR event is captured by this skill. It then accesses the information about the corresponding command in the Short Term Memory. This information is sent to the server, and the right coding is obtained from the database where all available command codings are stored. The infrared hardware then emits this coding and reports whether everything has gone right.
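As a minimal illustration of this flow, the following sketch of the Speech IR Control skill reuses the illustrative EventBus, ShortTermMemory, and GRAMMAR_TO_IR_COMMAND objects introduced earlier; the event names follow the paper, while everything else is an assumption.

class SpeechIRControlSkill:
    """Translates recognition events into navigation and infrared events."""

    def __init__(self, bus, memory):
        self.bus = bus
        self.memory = memory
        memory.register("ir_request")
        bus.subscribe("REC_OK", self.on_recognition)
        bus.subscribe("GOTO_OK", self.on_goal_reached)
        bus.subscribe("GOTO_FAIL", self.on_goal_failed)

    def on_recognition(self, rule_id):
        request = GRAMMAR_TO_IR_COMMAND.get(rule_id)
        if request is None:
            return  # the rule is not related to the infrared system: ignore it
        # Store the device and command, then ask the robot to move to the device.
        self.memory.write("ir_request", request)
        self.bus.emit("GOTO", request["device"])

    def on_goal_reached(self, _payload):
        # The robot is facing the appliance: trigger the IR Remote Control skill.
        self.bus.emit("CONTROL_IR", self.memory.read("ir_request"))

    def on_goal_failed(self, _payload):
        # Navigation failed: the operation is aborted (user notification omitted).
        pass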
Fig. 5. Communication among skills
Fig. 6. Sequence diagram of the involved skills
5 Testing the system
In order to prove the smooth running of the system, we have carried out several experiments in our lab, as if it were a real house. Our home appliance has been a TV screen, and we have taught different commands to the robot simply by pointing the remote control at the built-in infrared receiver and pressing the corresponding buttons. At this point, the remote device name, the timings, and the coding of the commands themselves are written to the database that will be used by the server. Then, we link each command with a grammar rule in the speech recognition system. The operations learnt by the robot are "turn on digital tv", "turn on digital radio", "channel up", "channel down", "volume up", and "volume down".
After many tests, it was observed that the system runs properly, although some key points must be considered. The robot pose is a very relevant factor, because the infrared emitter must be pointing at the TV screen and the direct line of sight has to be free of obstacles; the navigation system is therefore a key factor. Since human-robot interaction is led by oral communication, the user's speech has to be well understood by the ASR skill, so a robust and efficient speech recognition system is mandatory.
The same process followed for the TV screen can be applied to any infrared device with little effort.
6 Conclusions
We have reached the goals proposed at the beginning of this paper. A system for interacting with home appliances has been developed. The interaction is conducted through the social robot Maggie, which performs household tasks. Human-robot interaction is accomplished using a speech recognition system, so a natural interaction in human terms is achieved: communicating with the robot is like communicating with a person. This aspect is really important since the primary users will be children, elderly people, and disabled people (with the exception of the speech-impaired).
Commanding devices is carried out by an infrared transmitter built into Maggie. Infrared is the regular technology used by most appliances, therefore it is not necessary to modify them or to attach any electronic apparatus, and no costs for installation, new devices, or appliance changes are incurred.
Thanks to the robot navigation system, the presented system can operate infrared devices located in various rooms. The system could be expanded to accept commands through the internet; in this way, a user at work could order the robot to turn on the air conditioning at home. A different and feasible extension would be programming the robot to command devices in the future, e.g. the user could schedule the robot to turn off the air conditioning at 15:00.
The ideas presented in this paper are useful for all users and, in particular, for people with disabilities, since our system can help them improve their quality of life and their integration into society.
Acknowledgments. The authors gratefully acknowledge the funds provided by the Spanish Government through the projects Peer to Peer Robot-Human Interaction (R2H), of MEC (Ministry of Science and Education), and A new approach to social robotics (AROS), of MICINN (Ministry of Science and Innovation).
References
- Yu-Luen Chen, Fuk-Tan Tang, Walter H. Chang, May-Kuen Wong, Ying-Ying Shih, Te-Son Kuo: The New Design of an Infrared-Controlled Human-Computer Interface for the Disabled. IEEE Transactions on Rehabilitation Engineering, Vol. 7, No. 4, 474-481 (December 1999)
- Liang-Yen Lin, Ming-Chun Cheng, Shyan-Ming Yuan: Standards-based User Interface Technology for Universal Home Domination. International Conference on Hybrid Information Technology, Vol. 2, 298-307 (2006)
- Hiroshi Kuriyama, Hiroshi Mineno, Yasuhiro Seno, Takashi Furumura, Tadanori Mizuno: Evaluation of Home Appliance Translator for Remote Control of Conventional Home Appliances. IEEE International Symposium on Power Line Communications and Its Applications, 267-272 (2007)
- Kolberg, M., Magill, E.H., Wilson, M., Burtwistle, P., Ohlstenius, O.: Controlling Appliances with Pen and Paper. Second IEEE Consumer Communications and Networking Conference, 156-160 (2005)
- Ching-Shine Hwang, Tzuu-Shaang Wey, Yuan-Hung Lo: An Integration Platform for Developing Digital Life Applications. International Conference on Parallel and Distributed Systems, Vol. 2, 1-2 (2007)
- Takashi Yoshimi, Nobuto Matsuhira, Kaoru Suzuki, Daisuke Yamamoto, Fumio Ozaki, Junko Hirokawa, Hideki Ogawa: Development of a Concept Model of a Robotic Information Home Appliance, ApriAlpha. Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Vol. 1, 205-211 (2004)
- Nobuto Matsuhira, Fumio Ozaki, Hideki Ogawa, Takashi Yoshimi, Hideaki Hashimoto: Expanding Practicability of ApriAlpha in Cooperation with Networked Home Appliances. IEEE Workshop on Advanced Robotics and its Social Impacts, 254-259 (2005)
- Fujitsu Laboratories Ltd.: Internet-enabled Home Robot MARON-1. http://jp.fujitsu.com/group/labs/en/business/activities/activities-4/#robotics (2008)
- Barber, R.: Desarrollo de una Arquitectura para Robots Móviles Autónomos. Aplicación a un Sistema de Navegación Topológica. PhD Thesis, Carlos III University (2000)
- Rivas, R., Corrales, A., Barber, R., Salichs, M.A.: Robot Skill Abstraction for AD Architecture. 6th IFAC Symposium on Intelligent Autonomous Vehicles (2007)
- Castro-González, A.: Desde la Teleoperación al Control por Tacto del Robot Maggie. Master Thesis, Carlos III University (2008)
- IRTrans GmbH: Universal IR Solutions. http://www.irtrans.de (2009)