Professor Alex Zelinsky

Vice-Chancellor

Office of the Vice-Chancellor

Career Summary

Biography

Professor Alex Zelinsky AO is the University of Newcastle’s 8th Vice-Chancellor and President. He commenced in the role on 19 November 2018.

Prior to joining the University, Professor Zelinsky was Australia’s Chief Defence Scientist and leader of Defence Science and Technology within the Department of Defence.

Professor Zelinsky’s scientific career includes working as a computer scientist, systems engineer and roboticist and spans innovation, science and technology, research, commercial start-ups and education.

Prior to joining Defence, Professor Zelinsky was Group Executive for Information Sciences at CSIRO. Previously, he was Chief Executive Officer and co-founder of Seeing Machines, a technology company focused on computer vision. The company is listed on the London Stock Exchange and was a start-up from the Australian National University, where he was a Professor of Systems Engineering.

Professor Zelinsky has a Bachelor of Mathematical Sciences (Honours), Doctor of Philosophy and Honorary Doctor of Science from the University of Wollongong, and is a Graduate of the Australian Institute of Company Directors. He has completed the Advanced Management Program from Harvard University and the Senior Executive Program from London Business School.

Professor Zelinsky has received national and international awards in recognition of his work. In 2017, he was appointed an Officer of the Order of Australia (AO). He has been included in Engineers Australia’s list of the 100 most influential engineers since 2009, and in 2015 Engineers Australia awarded him the prestigious M A Sargent Medal. In 2013, he was awarded the Pearcey Medal, the ICT industry’s premier prize for lifetime achievement. In 2003, 2004 and 2005, the World Economic Forum selected Professor Zelinsky as a Technology Pioneer. In 2005, Professor Zelinsky received the Clunies-Ross Award for successful innovation for the benefit of Australia. In 2000, Professor Zelinsky received the BHERT International Award for his university-business research collaboration. He is a Fellow of the Institute of Electrical and Electronics Engineers, the Australian Academy of Technology and Engineering, the Institution of Engineers Australia and the Australian Institute of Company Directors.

Professor Zelinsky has been awarded over $7 million in competitive grants and industry funding, including from the Australian Research Council, Volvo, MITI and AusIndustry.


Qualifications

  • Doctor of Philosophy, University of Wollongong
  • Bachelor of Mathematics, University of Wollongong

Keywords

  • Mobile robots
  • Robot sensing systems
  • Robotics and automation
  • Systems engineering and theory
  • Vehicle safety

Languages

  • Japanese (Fluent)
  • Russian (Fluent)

Fields of Research

Code Description Percentage
090602 Control Systems, Robotics and Automation 100

Professional Experience

UON Appointment

Title Organisation / Department
Vice-Chancellor University of Newcastle
Office of the Vice-Chancellor
Australia

Academic appointment

Dates Title Organisation / Department
1/01/2004 - 31/12/2016 Adjunct Professor Australian National University
Australia
1/01/2000 - 1/07/2004 Professor Australian National University
Research School of Information Sciences and Engineering
Australia
1/10/1996 - 1/01/2000 Senior Fellow Australian National University
Research School of Information Sciences and Engineering
Australia
1/03/1995 - 1/10/1996 Senior Lecturer University of Wollongong
Australia
1/01/1992 - 1/02/1993 Senior Lecturer University of Wollongong
Computer Science
Australia
1/02/1984 - 1/12/1991 Lecturer University of Wollongong
Computer Science
Australia

Membership

Dates Title Organisation / Department
1/01/2017 - 19/12/2018 Research and Innovation Advisory Swinburne University of Technology
Australia
1/01/2017 - 19/12/2018 Advisory Board - ARC Centre of Excellence for Engineered Quantum Systems (EQUS) ARC (Australian Research Council)
Australia
1/09/2016 - 19/12/2018 Vice-Chancellor's Advisory University of Technology, Sydney
Australia
1/06/2016 - 19/12/2018 STEM Group Male Champions of Change
Australia
1/01/2016 - 19/12/2018 Advisory Board - ARC Centre of Excellence - Quantum Computing ARC (Australian Research Council)
Australia
1/01/2015 - 19/12/2018 Chair of Advisory Board - ARC Centre of Excellence - Robotic Vision ARC (Australian Research Council)
Australia
1/07/2014 - 19/12/2018 Co-chair - National Security S&T Advisory Australian Government
Australia
1/01/2013 - 19/12/2018 Advisory Board - National Science Technology Development Agency, Thailand National Science Technology Development Agency
Thailand
1/05/2012 - 1/12/2013 Reporting to Defence Minister - Defence Industry Innovation Board Australian Government
Australia
1/05/2012 - 19/12/2018 National Science and Research Committee Australian Government
Australia
1/01/2012 - 19/12/2018 Principal - The Technical Cooperation Program (Five Eyes: AUS, US, UK, CA, NZ) Australian Government
Australia
1/01/2012 - 1/12/2015 University Council University of Wollongong
Australia
1/01/2012 - 19/12/2018 Australian Defence Committee Australian Government
Australia
1/01/2012 - 19/12/2018 IEEE Robotics & Automation, International Medal Committee (Chair) IEEE
United States
1/05/2009 - 1/01/2012 Reporting to Minister Senator K. Carr - IT Industry Innovation Council Australian Government
Australia
1/01/2009 - 19/12/2018 Advisory Board - EU Robotics Program for Open Innovation European Commission, European Union
1/06/2008 - 1/12/2012 Super Computer Advisory Panel, National Computer Infrastructure Australian Government
Australia
1/04/2008 - 1/12/2009 Working Group on Innovation, Prime Minister's SE&I Council Australian Government
Australia
1/01/2008 - 31/12/2012 Vice President - IEEE Robotics & Automation Society, Industrial Activities IEEE
United States
1/01/2007 - 31/12/2013 Advisory Board - ARC Centre of Excellence - Autonomous Systems ARC (Australian Research Council)
Australia
1/12/2006 - 1/06/2009 Board member Funnelback Pty Ltd
Australia
1/06/2005 - 1/03/2012 Public R & D Advisory Panel, National ICT Round Table Australian Government
Australia
1/06/2005 - 1/03/2012 Board member Epicorp Ltd
Australia
1/02/2005 - 1/07/2007 Reporting to Minister Senator H. Coonan, ICT Advisory Board Australian Government
Australia
1/01/2005 - 31/12/2007 Member of Administrative Council - IEEE Robotics & Automation Society IEEE
United States
1/12/2004 - 1/04/2012 Industry Advisory Panel, Australian Industry Group Australian Government
Australia
1/01/2004 - 31/12/2013 Advisory Board - ARC Centre of Excellence - Vision Sciences ARC (Australian Research Council)
Australia
1/01/2004 - 31/12/2012 Editorial Board - IEEE Transactions on Intelligent Transportation Systems IEEE
United States
1/06/2000 - 1/01/2014 Board member Seeing Machines Ltd
Australia
1/01/2000 - 19/12/2018 Editorial Board - International Journal of Robotics Research Sage Press
Australia
1/01/1997 - 31/12/1999 President - Australian Robotics & Automation Association Australian Robotics & Automation Association
Australia

Professional appointment

Dates Title Organisation / Department
1/03/2012 - 16/11/2018 Chief Defence Scientist for Australia Australian Government
Australia
1/07/2007 - 1/02/2012 Group Executive, Information Sciences CSIRO - Commonwealth Scientific and Industrial Research Organisation
1/07/2004 - 1/07/2009 Director, ICT Centre CSIRO - Commonwealth Scientific and Industrial Research Organisation
1/07/2004 - 1/06/2008 Chief Technology Officer Seeing Machines Ltd
Australia
1/07/2000 - 1/07/2004 Chief Executive Officer Seeing Machines Ltd
Australia
1/03/1993 - 1/02/1995 Research Scientist Electrotechnical Laboratory, AIST, MITI
Japan
1/01/1978 - 1/01/1984 Computer Systems Engineer BHP Steel
Australia

Awards

Award

Year Award
2017 Officer in the General Division of the Order of Australia (AO)
Australian Government
2015 MA Sargent Medal
Engineers Australia
2013 Pearcey Medal, Lifetime Achievement Award
Pearcey Foundation
2010 IEEE Inaba Technical Award for Innovation Leading to Production
IEEE
2009 Professional Engineer of the Year
NSW Division, Institution of Engineers Australia (IEAust) | Australia
2006 iAward
Australian Information Industry Association
2005 ATSE Clunies-Ross Medal
National Science and Technology Award
2002 iAward
Australian Information Industry Association
2002 R&D Top 100 Award
R&D Magazine
2002 Eureka Science Award
Australian Museum
2001 BHERT Award
Business-Higher Education Round Table
2001 Australian Engineering Excellence Award
Engineers Australia
1999 Australian Engineering Excellence Award
Engineers Australia

Distinction

Year Award
2016 Knowledge Nation, Top 100 Innovators in Australia
Australian Government
2016 Top 100 Most Influential Engineers (2016-2019)
Institution of Engineers Australia (IEAust)
2013 Honorary Fellowship - Institution of Engineers, Australia (IEAust)
Institution of Engineers, Australia (IEAust)
2012 Innovation Hero, Warren Centre for Advanced Engineering
Warren Centre for Advanced Engineering
2010 Boeing Global Supplier of the Year - Research & Innovation
Boeing
2008 Fellow - Institution of Engineers, Australia (IEAust)
Institution of Engineers, Australia (IEAust)
2007 Fellow - Institute of Electrical & Electronic Engineers (IEEE)
IEEE
2007 Smart 100 List of Influential Australians
Bulletin Magazine
2006 Fellow - Australian Institute of Company Directors (AICD)
Australian Institute of Company Directors
2005 Technology Pioneer
World Economic Forum
2004 Technology Pioneer
World Economic Forum
2003 Smart 100 List of Influential Australians
Bulletin Magazine
2003 Technology Pioneer
World Economic Forum
2002 Fellow - Australian Academy of Technological Sciences & Engineering (ATSE)
Australian Academy of Technological Sciences & Engineering
2001 Top 100 Global Scientific Innovations
Discover Magazine

Invitations

Keynote Speaker

Year Title / Rationale
2017 Keynote address
2017 Keynote address
2017 Keynote address

Publications

For publications that are currently unpublished or in-press, details are shown in italics.


Book (1 outputs)

Year Citation Altmetrics Link
1998 Field and Service Robotics, Springer Verlag, London, 547 (1998)
DOI 10.1007/978-1-4471-1273-0

Chapter (7 outputs)

Year Citation Altmetrics Link
2016 Broggi A, Zelinsky A, Özgüner Ü, Laugier C, 'Intelligent Vehicles', Springer Handbook of Robotics, Springer Verlag, Berlin, DE 1627-1656 (2016)
DOI 10.1007/978-3-319-32552-1_62
2014 Perera S, Barnes N, Zelinsky A, 'Exploration: Simultaneous Localization and Mapping (SLAM)', Computer Vision: A Reference Guide, Springer US, Boston, MA 268-275 (2014)
DOI 10.1007/978-0-387-31439-6_280
2006 Petersson L, Fletcher L, Barnes N, Zelinsky A, 'Smart cars: The next frontier', Advances in Applied Artificial Intelligence 120-156 (2006)

This chapter gives an overview of driver assistance systems (DAS) in general and the Smart Cars project in particular. In the Driver Assistance Systems Section, a set of key competencies for an effective DAS are identified by comparing with a human co-pilot, namely, traffic situation monitoring, driver's state monitoring, vehicle state monitoring, communication with the driver, vehicle control, and a reasoning system. It is also recognised that such a system must be intuitive, non-intrusive and override-able. A few of the currently available commercial systems are mentioned in the following section. The Smart Cars project, which is a joint project between the Australian National University and National ICT Australia, is then introduced. A number of different research directions within the project are then presented in detail: obstacle detection and tracking, speed sign detection and recognition, pedestrian detection, and blind spot monitoring. © 2006, Idea Group Inc.

DOI 10.4018/978-1-59140-827-7.ch005
2002 Zelinsky A, Jung DL, 'Symbol grounding for communication', Robot Teams: From Diversity to Polymorphism, A K Peters/CRC Press, Natick, Massachusetts --- (2002)
1998 Heinzmann J, Zelinsky A, 'A visual interface for human-robot interaction', Field and Service Robotics, Springer, New York 540-547 (1998)
1998 Brooks A, Dickins G, Zelinsky A, Kieffer J, Abdallah S, 'A High-Performance Camera Platform for Real-Time Active Vision', Field and Service Robotics, Springer, London 527-532 (1998)
DOI 10.1007/978-1-4471-1273-0_79
1998 Ward K, Zelinsky A, 'An Exploratory Robot Controller which Adapts to Unknown Environments and Damaged Sensors', Field and Service Robotics, Springer, London 456-463 (1998)
DOI 10.1007/978-1-4471-1273-0_69

Journal article (39 outputs)

Year Citation Altmetrics Link
2014 Zelinsky A, 'Science and technology: Supporting Australia's national security', United Service, 65 14-16 (2014)
2009 Zelinsky A, 'Trajectory Planning for Automatic Machines and Robots (Biagiotti, L. et al; 2008) [On the shelf]', IEEE ROBOTICS & AUTOMATION MAGAZINE, 16 101-101 (2009)
2009 Zelinsky A, 'Learning OpenCV: Computer Vision with the OpenCV Library', IEEE Robotics and Automation Magazine, 16 100 (2009)
DOI 10.1109/MRA.2009.933612
Citations Scopus - 9
2009 Fletcher L, Zelinsky A, 'Driver Inattention Detection based on Eye Gaze-Road Event Correlation', INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH, 28 774-801 (2009)
DOI 10.1177/0278364908099459
Citations Scopus - 71Web of Science - 44
2009 Zelinsky A, 'Robot suit hybrid assistive limb', IEEE Robotics and Automation Magazine, 16 (2009)

Since 2005, the IEEE Robotics and Automation Society and International Federation of Robotics (IFR) have cooperated to host a joint Forum of Innovation and Entrepreneurship in Robotics and Automation to foster cooperation between the scientific community and industry. As part of the forum, an award for Invention and Entrepreneurship in Robotics and Automation (IERA) is made. This year the IEEE-IFR forum was held in Kobe, Japan, in conjunction with the IEEE International Conference on Robotics and Automation (ICRA) 2009.

DOI 10.1109/MRA.2009.934836
Citations Scopus - 3
2008 Barnes N, Zelinsky A, Fletcher LS, 'Real-time speed sign detection using the radial symmetry detector', IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 9 322-332 (2008)
DOI 10.1109/TITS.2008.02935
Citations Scopus - 93Web of Science - 70
2008 Zelinsky A, 'Welcome. . . [Industrial Activities Board]', IEEE ROBOTICS & AUTOMATION MAGAZINE, 15 95-96 (2008)
2008 Barnes N, Zelinsky A, 'Robotics Research in Australia - A national perspective on the needs, themes, and major groups', IEEE ROBOTICS & AUTOMATION MAGAZINE, 15 89-95 (2008)
DOI 10.1109/M-RA.2007.907353
2007 Zelinsky A, 'From the guest editors - Field and service applications', IEEE Robotics and Automation Magazine, 14 6 (2007)
DOI 10.1109/MRA.2007.901321
2007 Dankers A, Barnes N, Zelinsky A, 'MAP ZDF segmentation and tracking using active stereo vision: Hand tracking case study', COMPUTER VISION AND IMAGE UNDERSTANDING, 108 74-86 (2007)
DOI 10.1016/j.cviu.2006.10.013
Citations Scopus - 6Web of Science - 2
2006 Asama H, Prassler E, Thrun S, Zelinsky A, 'Special issue on the fourth International Conference on Field and Service Robotics, 2003', International Journal of Robotics Research, 25 3-5 (2006)
DOI 10.1177/02783649506062513
2006 Petersson L, Zelinsky A, 'Towards safer roads by integration of road scene monitoring and vehicle control', FIELD AND SERVICE ROBOTICS: RECENT ADVANCES IN RESEARCH AND APPLICATIONS, 24 367+ (2006)
2005 Zelinsky A, 'Reinventing ICT research', JOURNAL OF RESEARCH AND PRACTICE IN INFORMATION TECHNOLOGY, 37 3-10 (2005)
2005 Atienza R, Zelinsky A, 'Intuitive human-robot interaction through active 3D gaze tracking', Robotics Research, 15 172-181 (2005)
Citations Scopus - 8Web of Science - 5
2004 Corke P, Jarvis R, Zelinsky A, 'Special issue on the 10th International Symposium on Robotics Research', INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH, 23 99-99 (2004)
DOI 10.1177/0278364904041319
2003 Loy G, Zelinsky A, 'Fast radial symmetry for detecting points of interest', IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 25 959-973 (2003)
DOI 10.1109/TPAMI.2003.1217601
Citations Scopus - 397Web of Science - 289
2003 Chen J, Zelinsky A, 'Programing by demonstration: Coping with suboptimal teaching actions', INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH, 22 299-319 (2003)
DOI 10.1177/0278364903022005002
Citations Scopus - 34Web of Science - 26
2003 Halme A, Prassler E, Zelinsky A, 'Special Issue on the 3rd International Conference on Field and Service Robotics', INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH, 22 439-440 (2003)
2003 Fletcher L, Apostoloff N, Petersson L, Zelinsky A, 'Vision in and out of Vehicles', IEEE INTELLIGENT SYSTEMS, 18 12-17 (2003)
DOI 10.1109/MIS.2003.1200722
Citations Scopus - 51Web of Science - 26
2002 Loy G, Zelinsky A, 'A fast radial symmetry transform for detecting points of interest', COMPUTER VISION - ECCV 2002, PT 1, 2350 358-368 (2002)
Citations Scopus - 40Web of Science - 15
2002 O'Hagan RG, Zelinsky A, Rougeaux S, 'Visual gesture interfaces for virtual environments', INTERACTING WITH COMPUTERS, 14 231-250 (2002)
DOI 10.1016/S0953-5438(01)00050-9
Citations Scopus - 47Web of Science - 28
2002 Thompson S, Zelinsky A, 'Accurate local positioning using visual landmarks from a panoramic sensor', 2002 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS I-IV, PROCEEDINGS, 2656-2661 (2002)
Citations Scopus - 13Web of Science - 6
2001 Cheng G, Zelinsky A, 'Supervised autonomy: A framework for human-robot systems development', AUTONOMOUS ROBOTS, 10 251-266 (2001)
DOI 10.1023/A:1011231725361
Citations Scopus - 28Web of Science - 19
2000 Bianco G, Zelinsky A, Lehrer M, 'Visual landmark learning', IEEE International Conference on Intelligent Robots and Systems, 1 227-232 (2000)

Biology often offers valuable examples of systems both for learning and for controlling motion. Work in robotics has often been inspired by these findings in diverse ways. However, the fundamental aspects that involve visual landmark learning and motion control mechanisms have almost exclusively been approached heuristically rather than by examining the underlying principles. In this paper we introduce theoretical tools that might explain how the visual learning works and why the motion is attracted by the pre-learnt goal position. Basically, the theoretical tools emerge from the navigation vector field produced by the visual behaviors. Both the learning process and the navigation scheme influence the motion field. We apply classical mathematical and dynamic control to analyze the efficiency of our method.

DOI 10.1109/IROS.2000.894609
Citations Scopus - 9
2000 Bares J, Choset H, Zelinsky A, 'Editorial: Special issue on field and service robotics', INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH, 19 971-971 (2000)
DOI 10.1177/02783640022067913
2000 Jung D, Zelinsky A, 'Grounded symbolic communication between heterogeneous cooperating robots', AUTONOMOUS ROBOTS, 8 269-292 (2000)
DOI 10.1023/A:1008929609573
Citations Scopus - 47Web of Science - 33
2000 Ward K, Zelinsky A, 'Acquiring mobile robot behaviors by learning trajectory velocities', AUTONOMOUS ROBOTS, 9 113-133 (2000)
DOI 10.1023/A:1008914200569
Citations Scopus - 5Web of Science - 4
2000 Bianco G, Zelinsky A, 'Dealing with robustness in mobile robot guidance while operating with visual strategies', Proceedings-IEEE International Conference on Robotics and Automation, 4 3778-3783 (2000)

This paper introduces a theory to formally and practically analyze the robustness issues of visual guidance methods for robot navigation. The first aspect is related to the convergence of the navigation system to the goal. It will be shown how the dynamic system which drives the strategies can be analyzed by using classical concepts such as the Lyapunov functions. The second aspect concerns the conservativeness of the resulting navigation vector fields. It will be shown how this deals with the repeatability of the trials. Furthermore, the selection of the best landmarks to perform the navigation processes strongly affects the conservativeness, thus providing a formal way to do landmark learning. The theory has been tested with two different visual methods that have been derived from the biological world: the snapshot model and the landmark model. The former considers a portion of the full panorama taken by a color camera to accomplish navigating actions. The latter is a more sophisticated approach which uses the most suitable visual landmarks to calculate navigation movements.

DOI 10.1109/ROBOT.2000.845320
Citations Scopus - 5
1999 Zelinsky A, Russell A, Kleeman L, 'Field and service robotics', ROBOTICS AND AUTONOMOUS SYSTEMS, 26 77-79 (1999)
DOI 10.1016/S0921-8890(98)00061-X
1999 Heinzmann J, Zelinsky A, 'A safe-control paradigm for human-robot interaction', JOURNAL OF INTELLIGENT & ROBOTIC SYSTEMS, 25 295-310 (1999)
DOI 10.1023/A:1008135313919
Citations Scopus - 24Web of Science - 21
1999 Jung D, Zelinsky A, 'An architecture for distributed cooperative planning in a behaviour-based multi-robot system', ROBOTICS AND AUTONOMOUS SYSTEMS, 26 149-174 (1999)
DOI 10.1016/S0921-8890(98)00066-9
Citations Scopus - 36Web of Science - 14
1999 Matsumoto Y, Ogasawara T, Zelinsky A, 'Development of Real-time Face and Gaze Measurement System', Technical Report of IEICE, 99 9-14 (1999)
1998 Zelinsky A, Heinzmann J, 'A novel visual interface for human-robot communication', ADVANCED ROBOTICS, 11 827-852 (1998)
Citations Scopus - 14Web of Science - 11
1996 Zelinsky A, Kuniyoshi Y, 'Learning to coordinate behaviors for robot navigation', ADVANCED ROBOTICS, 10 143-159 (1996)
Citations Scopus - 3Web of Science - 3
1994 ZELINSKY A, 'USING PATH TRANSFORMS TO GUIDE THE SEARCH FOR FINDPATH IN 2D', INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH, 13 315-325 (1994)
DOI 10.1177/027836499401300403
Citations Scopus - 85Web of Science - 78
1992 ZELINSKY A, 'A MOBILE ROBOT EXPLORATION ALGORITHM', IEEE TRANSACTIONS ON ROBOTICS AND AUTOMATION, 8 707-717 (1992)
DOI 10.1109/70.182671
Citations Scopus - 156Web of Science - 107
1991 ZELINSKY A, 'MOBILE ROBOT MAP MAKING USING SONAR', JOURNAL OF ROBOTIC SYSTEMS, 8 557-577 (1991)
DOI 10.1002/rob.4620080502
Citations Scopus - 16Web of Science - 13
1990 MILWAY M, GORTON I, FULCHER J, ZELINSKY A, 'INTERFACING TRANSPUTER LINKS TO EXTERNAL DEVICES', MICROPROCESSORS AND MICROSYSTEMS, 14 644-652 (1990)
DOI 10.1016/0141-9331(90)90039-X
Citations Scopus - 5Web of Science - 2
1988 ZELINSKY A, 'Robot navigation with learning', Australian Computer Journal, 20 85-93 (1988)
Citations Web of Science - 4

Conference (121 outputs)

Year Citation Altmetrics Link
2018 Zelinsky A, 'Welcome Message from the General Chair', Brisbane, QLD (2018)
2014 Heinzmann J, Zelinsky A, 'Automotive safety solutions through technology and human-factors innovation', Springer Tracts in Advanced Robotics (2014)

© Springer-Verlag Berlin Heidelberg 2014. Advanced Driver Assistance Systems (ADAS) provide warnings and in some cases autonomous actions to increase driver and passenger safety by combining sensor technologies and situation awareness. In the last 10 years, ADAS have progressed from prototype demonstrators to full product deployment in motor vehicles. Early ADAS examples, such as Lane Departure Warning (LDW) and Forward Collision Warning (FCW) systems, were developed to warn drivers of potentially dangerous situations. More recently, driver inattention systems have made their debut. These systems are tackling one of the major causes of fatalities on roads: drowsiness and distraction. This paper describes DSS, a driver inattention warning system which has been developed by Seeing Machines for commercial applications, with an initial focus on heavy vehicle fleet management applications. A case study reporting a year-long real-world deployment of DSS is presented. The study showed the effectiveness of the DSS technology in mitigating driver inattention in a sustained manner.

DOI 10.1007/978-3-642-28572-1_10
Citations Scopus - 2
2012 Victor T, Blomberg O, Zelinsky A, 'Automating the Measurement of Driver Visual Behaviour Using Passive Stereo Vision', Vision in Vehicles IX. Proceedings of the 9th International Conference on Vision in Vehicles, Brisbane, Queensland (2012)
2009 Dankers A, Barnes N, Bischof WF, Zelinsky A, 'Humanoid Vision Resembles Primate Archetype', EXPERIMENTAL ROBOTICS, Athens, GREECE (2009)
Citations Scopus - 3Web of Science - 2
2009 Zelinsky A, 'Dependable autonomous systems', Zhuhai, China (2009)
DOI 10.1109/ICINFA.2009.5204879
2008 Fletcher L, Zelinsky A, 'Context sensitive driver assistance based on gaze - Road scene correlation', EXPERIMENTAL ROBOTICS, Rio de Janeiro, BRAZIL (2008)
Citations Scopus - 1
2007 Dankers A, Barnes N, Zelinsky A, 'A Reactive Vision System: Active-Dynamic Saliency', Proceedings of the 5th International Conference on Computer Vision Systems (ICVS 2007), Bielefeld, DE (2007)
DOI 10.2390/biecoll-icvs2007-91
2007 Zelinsky A, 'Message from international advisory committee chair', ISCIT 2007 - 2007 International Symposium on Communications and Information Technologies Proceedings (2007)
DOI 10.1109/ISCIT.2007.4391973
2007 Fletcher L, Zelinsky A, 'Driver state monitoring to mitigate distraction', Distracted Driving. International Conference on Distracted Driving, Sydney, Australia (2007)
2006 Dankers A, Barnes N, Zelinsky A, 'Bimodal active stereo vision', FIELD AND SERVICE ROBOTICS, Port Douglas, AUSTRALIA (2006)
2006 Petersson L, Fletcher L, Zelinsky A, Barnes N, Arnell F, 'Towards safer roads by integration of road scene monitoring and vehicle control', INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH, Mt Fuji, JAPAN (2006)
DOI 10.1177/0278364906061156
Citations Scopus - 9Web of Science - 8
2005 Atienza R, Zelinsky A, 'Intuitive interface through active 3D gaze tracking', Proceedings of the 2005 International Conference on Active Media Technology, AMT 2005 (2005)

Our interaction with machines is always severely constrained by unnatural interfaces such as mouse, keyboard and joystick. Such interfaces make it difficult for us to convey our ideas in order for computers to understand and perform our intended tasks. In this research, our aim is to build systems that use natural human actions as interfaces. In particular we exploit gaze to track a person's focus of attention. We present an active gaze tracking system that enables a user to instruct a robot arm to pick up and hand over an object placed arbitrarily in 3D space. Our system determines the precise 3D position of the object of unknown size, shape and color by following the person's steady gaze. © 2005 IEEE.

DOI 10.1109/AMT.2005.1505258
Citations Scopus - 3
2005 Fletcher L, Petersson L, Barnes N, Austin D, Zelinsky A, 'A sign reading driver assistance system using eye gaze', 2005 IEEE International Conference on Robotics and Automation (ICRA), Vols 1-4, Barcelona, SPAIN (2005)
Citations Scopus - 5Web of Science - 1
2005 Fletcher L, Petersson L, Zelinsky A, 'Road scene monotony detection in a Fatigue Management Driver Assistance System', 2005 IEEE INTELLIGENT VEHICLES SYMPOSIUM PROCEEDINGS, Las Vegas, NV (2005)
Citations Scopus - 19Web of Science - 2
2005 Dankers A, Barnes N, Zelinsky A, 'Active vision for road scene awareness', 2005 IEEE Intelligent Vehicles Symposium Proceedings, Las Vegas, NV (2005)
Citations Scopus - 6
2005 Petersson L, Fletcher L, Zelinsky A, 'A framework for driver-in-the-loop driver assistance systems', 2005 IEEE Intelligent Transportation Systems Conference (ITSC), Vienna, AUSTRIA (2005)
Citations Scopus - 17Web of Science - 1
2005 Fletcher L, Loy G, Barnes N, Zelinsky A, 'Correlating driver gaze with the road scene for driver assistance systems', ROBOTICS AND AUTONOMOUS SYSTEMS, Sendai, JAPAN (2005)
DOI 10.1016/j.robot.2005.03.010
Citations Scopus - 56Web of Science - 41
2004 Barnes N, Zelinsky A, 'Real-time radial symmetry for speed sign detection', 2004 IEEE INTELLIGENT VEHICLES SYMPOSIUM, Parma, ITALY (2004)
Citations Scopus - 121Web of Science - 16
2004 Fletcher L, Zelinsky A, 'Super-resolving Signs for Classification', Canberra, ACT (2004)
2004 Petersson L, Fletcher L, Barnes N, Zelinsky A, 'An interactive driver assistance system monitoring the scene in and out of the vehicle', 2004 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS 1- 5, PROCEEDINGS, New Orleans, LA (2004)
DOI 10.1109/ROBOT.2004.1308791
Citations Scopus - 9Web of Science - 2
2004 Grubb G, Zelinsky A, Nilsson L, Rilbe M, '3D vision sensing for improved pedestrian safety', 2004 IEEE INTELLIGENT VEHICLES SYMPOSIUM, Parma, ITALY (2004)
Citations Scopus - 86Web of Science - 1
2004 Matuszyk L, Zelinsky A, Nilsson L, Rilbe M, 'Stereo panoramic vision for monitoring vehicle blind-spots', 2004 IEEE INTELLIGENT VEHICLES SYMPOSIUM, Parma, ITALY (2004)
Citations Scopus - 16Web of Science - 1
2004 Apostoloff N, Zelinsky A, 'Vision in and out of vehicles: Integrated driver and road scene monitoring', INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH, SANT ANGELO, ITALY (2004)
DOI 10.1177/0278364904042206
Citations Scopus - 40Web of Science - 26
2004 Dankers A, Zelinsky A, 'CeDAR: A real-world vision system - Mechanism, control and visual processing', MACHINE VISION AND APPLICATIONS, British Machine Vis Assoc, Cambridge, ENGLAND (2004)
DOI 10.1007/s00138-004-0156-3
Citations Scopus - 25Web of Science - 17
2004 Dankers A, Barnes N, Zelinsky A, 'Active vision-rectification and depth mapping', ACRA 2004. Australasian Conference on Robotics and Automation 2004, Canberra, Australia (2004)
2003 Atienza R, Zelinsky A, 'Interactive skills using active gaze tracking', ICMI'03: Fifth International Conference on Multimodal Interfaces (2003)

We have incorporated interactive skills into an active gaze tracking system. Our active gaze tracking system can identify an object in a cluttered scene that a person is looking at. By following the user's 3-D gaze direction together with a zero-disparity filter, we can determine the object's position. Our active vision system also directs attention to a user by tracking anything with both motion and skin color. A Particle Filter fuses skin color and motion from optical flow techniques together to locate a hand or a face in an image. The active vision then uses stereo camera geometry, Kalman Filtering and position and velocity controllers to track the feature in real-time. These skills are integrated together such that they cooperate with each other in order to track the user's face and gaze at all times. Results and video demos provide interesting insights on how active gaze tracking can be utilized and improved to make human-friendly user interfaces. Copyright 2003 ACM.

Citations Scopus - 4
2003 Halme A, Prassler E, Zelinsky A, 'Editorial: Special issue on the 3rd International Conference on Field and Service Robotics', International Journal of Robotics Research (2003)
Citations Scopus - 1
2003 Petersson L, Apostoloff N, Zelinsky A, 'Driver assistance: An integration of vehicle monitoring and control', 2003 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS 1-3, PROCEEDINGS, TAIPEI, TAIWAN (2003)
DOI 10.1109/ROBOT.2003.1241903
Citations Scopus - 1, Web of Science - 1
2003 Gaskett C, Brown P, Cheng G, Zelinsky A, 'Learning implicit models during target pursuit', 2003 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS 1-3, PROCEEDINGS, TAIPEI, TAIWAN (2003)
Citations Scopus - 2
2003 Fletcher L, Petersson L, Zelinsky A, 'Driver assistance systems based on vision in and out of vehicles', IEEE IV2003: INTELLIGENT VEHICLES SYMPOSIUM, PROCEEDINGS, COLUMBUS, OH (2003)
DOI 10.1109/IVS.2003.1212930
Citations Scopus - 37, Web of Science - 19
2003 Zelinsky A, 'Toward smart cars with computer vision for integrated driver and road scene monitoring', STUDIES IN PERCEPTION AND ACTION VII, GOLD COAST, AUSTRALIA (2003)
2003 Apostoloff N, Zelinsky A, 'Robust vision based lane tracking using multiple cues and particle filtering', IEEE Intelligent Vehicles Symposium, Proceedings (2003)

© 2003 IEEE. One of the more startling effects of road related accidents is the economic and social burden they cause. Between 750,000 and 880,000 people died globally in road related accidents in 1999 alone, with an estimated cost of US$518 billion. One way of combating this problem is to develop Intelligent Vehicles that are self-aware and act to increase the safety of the transportation system. This paper presents the development and application of a novel multiple-cue visual lane tracking system for research into Intelligent Vehicles (IV). Particle filtering and cue fusion technologies form the basis of the lane tracking system which robustly handles several of the problems faced by previous lane tracking systems such as shadows on the road, unreliable lane markings, dramatic lighting changes and discontinuous changes in road characteristics and types. Experimental results of the lane tracking system running at 15 Hz will be discussed, focusing on the particle filter and cue fusion technology used.

DOI 10.1109/IVS.2003.1212973
Citations Scopus - 117
2003 Thompson S, Zelinsky A, 'Accurate vision based position tracking between places in a topological map', Proceedings of IEEE International Symposium on Computational Intelligence in Robotics and Automation, CIRA (2003)

© 2003 IEEE. This paper presents a method for accurately tracking the position of a mobile robot which moves between places in a previously learned topological map. Places in the map are represented by sets of visual landmarks extracted from panoramic images. Probabilistic localisation methods and the landmark representation enable position tracking within places. A sensor model is presented which improves the accuracy of local position estimates, and is robust in the presence of occlusion and data association errors. Position tracking between places requires the recognition of place transition events and the passing of local position estimates between places. This paper presents such a system and reports real world position tracking results from paths through topological maps.

DOI 10.1109/CIRA.2003.1222138
Citations Scopus - 4
2003 Apostoloff N, Zelinsky A, 'Vision in and out of vehicles: Integrated driver and road scene monitoring', EXPERIMENTAL ROBOTICS VIII, SANT ANGELO, ITALY (2003)
2003 Dankers A, Zelinsky A, 'A real-world vision system: Mechanism, control, and vision processing', COMPUTER VISION SYSTEMS, PROCEEDINGS, GRAZ, AUSTRIA (2003)
Citations Scopus - 1
2003 Heinzmann J, Zelinsky A, 'Quantitative safety guarantees for physical human-robot interaction', INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH (2003)
DOI 10.1177/02783649030227004
Citations Scopus - 113, Web of Science - 103
2003 Dankers A, Fletcher L, Petersson L, Zelinsky A, 'Driver assistance: Contemporary road safety', ACRA 2003. Australasian Conference on Robotics Automation 2003, Brisbane, Australia (2003)
2002 Loy G, Fletcher L, Apostoloff N, Zelinsky A, 'An adaptive fusion architecture for target tracking', FIFTH IEEE INTERNATIONAL CONFERENCE ON AUTOMATIC FACE AND GESTURE RECOGNITION, PROCEEDINGS, WASHINGTON, D.C. (2002)
DOI 10.1109/AFGR.2002.1004164
Citations Scopus - 49, Web of Science - 10
2002 Atienza R, Zelinsky A, 'Active gaze tracking for human-robot interaction', FOURTH IEEE INTERNATIONAL CONFERENCE ON MULTIMODAL INTERFACES, PROCEEDINGS, PITTSBURGH, PA (2002)
DOI 10.1109/ICMI.2002.1167004
Citations Scopus - 19, Web of Science - 9
2002 Bianco GM, Zelinsky A, 'The convergence property of goal-based visual navigation', IEEE International Conference on Intelligent Robots and Systems (2002)

The use of landmarks is a natural and instinctive method to determine the whereabouts of a location or a means to proceed to a particular location. Results provided in this paper indicate that landmark-based navigation possesses a corrective, or feedback, trait that produces a convergence bound on the movements to the goal position, in contrast to odometry-based movements, which lead to drift between successive navigation movements. Experiments show that the vector field approach can be used to explain the convergence property of landmark-based guidance tasks. The experiments were carried out with a Nomad mobile robot equipped with a real-time visual landmark tracking system.

Citations Scopus - 2
2002 Petersson L, Apostoloff N, Zelinsky A, 'Driver assistance based on vehicle monitoring and control', Proceedings of the 2002 Australasian Conference on Robotics and Automation, Auckland, NZ (2002)
2001 Atienza R, Zelinsky A, 'A practical zoom camera calibration technique: An application of active vision for human-robot interaction', ACRA 2001. Australian Conference on Robotics and Automation 2001, Sydney, Australia (2001)
2001 Göcke R, Millar JB, Zelinsky A, Robert-Ribes J, 'Stereo Vision Lip-Tracking for Audio-Video Speech Processing', Salt Lake City (2001)
2001 Chen JR, Zelinsky A, 'Generating a configuration space representation for assembly tasks from demonstration', 2001 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS I-IV, PROCEEDINGS, SEOUL, SOUTH KOREA (2001)
Citations Scopus - 1
2001 Chen JR, Zelinsky A, 'Programming by demonstration: Removing suboptimal actions in a partially known configuration space', 2001 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS I-IV, PROCEEDINGS, SEOUL, SOUTH KOREA (2001)
Citations Scopus - 5
2001 Austin D, Fletcher L, Zelinsky A, 'Mobile robotics in the long term - Exploring the fourth dimension', IROS 2001: PROCEEDINGS OF THE 2001 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS, VOLS 1-4, MAUI, HI (2001)
Citations Scopus - 13, Web of Science - 4
2001 Silpa-Anan C, Brinsmead T, Abdallah S, Zelinsky A, 'Preliminary experiments in visual servo control for autonomous underwater vehicle', IROS 2001: PROCEEDINGS OF THE 2001 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS, VOLS 1-4, MAUI, HI (2001)
Citations Scopus - 22, Web of Science - 10
2001 Sutherland O, Truong H, Rougeaux S, Zelinsky A, 'Advancing active vision systems by improved design and control', EXPERIMENTAL ROBOTICS VII, WAIKIKI, HAWAII (2001)
2001 Heinzmann J, Zelinsky A, 'Visual human-robot interaction', 2001 INTERNATIONAL WORKSHOP ON BIO-ROBOTICS AND TELEOPERATION, PROCEEDINGS, BEIJING INST TECHNOL, BEIJING, PEOPLES R CHINA (2001)
2001 Silpa-Anan C, Zelinsky A, 'Kambara: past, present and future', ACRA 2001 : Proceedings of the 2001 Australian Conference on Robotics and Automation, Sydney, NSW (2001)
2001 Goecke R, Millar JB, Zelinsky A, Robert-Ribes J, 'Analysis of audio-video correlation in vowels in Australian English', AVSP 2001 International Conference on Auditory-Visual Speech Processing, Aalborg, Denmark (2001)
2001 Fletcher L, Apostoloff N, Chen J, Zelinsky A, 'Computer vision for vehicle monitoring and control', Proceedings 2001 Australian Conference on Robotics and Automation, Sydney, NSW (2001)
2000 Truong H, Abdallah S, Rougeaux S, Zelinsky A, 'A novel mechanism for stereo active vision', ACRA 2000. Australian Conference on Robotics and Automation 2000, Sydney, Australia (2000)
2000 Bianco G, Zelinsky A, 'Real time analysis of the robustness of the navigation strategy of a visually guided mobile robot', Intelligent Autonomous Systems 6, Venice, Italy (2000)
2000 Cvetanovski J, Abdallah S, Brinsmead T, Zelinsky A, Wettergreen D, 'A state estimation system for an autonomous underwater vehicle', ACRA 2000. Australian Conference on Robotics and Automation 2000, Melbourne, Australia (2000)
2000 Gaskett C, Fletcher L, Zelinsky A, 'Reinforcement learning for visual servoing of a mobile robot', ACRA 2000. Australian Conference on Robotics and Automation 2000, Melbourne Australia (2000)
2000 Bryant M, Wettergreen D, Abdallah S, Zelinsky A, 'Robust camera calibration for an autonomous underwater vehicle', ACRA 2000. Australian Conference on Robotics and Automation 2000, Melbourne, Australia (2000)
2000 Loy G, Goecke R, Rougeaux S, Zelinsky A, 'Stereo 3D Lip Tracking', Proceedings of the 6th International Conference on Control, Automation, Robotics and Vision ICARCV2000, Singapore (2000)
2000 Sutherland O, Rougeaux S, Abdallah S, Zelinsky A, 'Tracking with hybrid-drive active vision', Melbourne, Vic (2000)
2000 Goecke R, Tran QN, Zelinsky A, Millar JB, Robert-Ribes J, 'Validation of an automatic lip-tracking algorithm and design of a database for audio-video speech processing', Canberra, ACT (2000)
2000 Gaskett C, Fletcher L, Zelinsky A, 'Reinforcement learning for a vision based mobile robot', 2000 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS 2000), VOLS 1-3, PROCEEDINGS, KAGAWA UNIV, TAKAMATSU, JAPAN (2000)
Citations Scopus - 20, Web of Science - 2
2000 O'Hagan R, Zelinsky A, 'Visual gesture interfaces for virtual environments', Proceedings - 1st Australasian User Interface Conference, AUIC 2000 (2000)

© 2000 IEEE. Virtual environments provide a whole new way of viewing and manipulating 3D data. Current technology moves the images out of desktop monitors and into the space immediately surrounding the user. Users can literally put their hands on the virtual objects. Unfortunately techniques for interacting with such environments have yet to mature. Gloves and sensor based trackers are unwieldy, constraining and uncomfortable to use. A natural, more intuitive method of interaction would be to allow the user to grasp objects with their hands and manipulate them as if they were real objects. We are investigating the use of computer vision in implementing a natural interface based on hand gestures. A framework for a gesture recognition system is introduced along with results of experiments in colour segmentation, feature extraction and template matching for finger and hand tracking and hand pose recognition. Progress in the implementation of a gesture interface for navigation and object manipulation in virtual environments is discussed.

DOI 10.1109/AUIC.2000.822069
Citations Scopus - 15
2000 Newman R, Matsumoto Y, Rougeaux S, Zelinsky A, 'Real-time stereo tracking for head pose and gaze estimation', Proceedings - 4th IEEE International Conference on Automatic Face and Gesture Recognition, FG 2000 (2000)

Computer systems which analyse human face/head motion have attracted significant attention recently as there are a number of interesting and useful applications. Not least among these is the goal of tracking the head in real time. A useful extension of this problem is to estimate the subject's gaze point in addition to his/her head pose. This paper describes a real-time stereo vision system which determines the head pose and gaze direction of a human subject. Its accuracy makes it useful for a number of applications including human/computer interaction, consumer research and ergonomic assessment. © 2000 IEEE.

DOI 10.1109/AFGR.2000.840622
Citations Scopus - 79
2000 Matsumoto Y, Zelinsky A, 'An algorithm for real-time stereo vision implementation of head pose and gaze direction measurement', Proceedings - 4th IEEE International Conference on Automatic Face and Gesture Recognition, FG 2000 (2000)

To build smart human interfaces, it is necessary for a system to know a user's intention and point of attention. Since the motion of a person's head pose and gaze direction are deeply related with his/her intention and attention, detection of such information can be utilized to build natural and intuitive interfaces. We describe our real-time stereo face tracking and gaze detection system to measure head pose and gaze direction simultaneously. The key aspect of our system is the use of real-time stereo vision together with a simple algorithm which is suitable for real-time processing. Since the 3D coordinates of the features on a face can be directly measured in our system, we can significantly simplify the algorithm for 3D model fitting to obtain the full 3D pose of the head compared with conventional systems that use a monocular camera. Consequently we achieved a non-contact, passive, real-time, robust, accurate and compact measurement system for head pose and gaze direction. © 2000 IEEE.

DOI 10.1109/AFGR.2000.840680
Citations Scopus - 240
2000 Zelinsky A, Matsumoto Y, Heinzmann J, Newman R, 'Towards human friendly robots: Vision-based interfaces and safe mechanisms', EXPERIMENTAL ROBOTICS VI, SYDNEY, AUSTRALIA (2000)
Citations Web of Science - 2
2000 Heinzmann J, Zelinsky A, 'Building human-friendly robot systems', ROBOTICS RESEARCH, SNOWBIRD, UT (2000)
2000 Matsumoto Y, Ogasawara T, Zelinsky A, 'Behavior recognition based on head pose and gaze direction measurement', IEEE International Conference on Intelligent Robots and Systems (2000)

To build smart human interfaces, it is necessary for a system to know a user's intention and point of attention. Since the motion of a person's head pose and gaze direction are deeply related with his/her intention and attention, detection of such information can be utilized to build natural and intuitive interfaces. In this paper, we describe a behavior recognition system based on the real-time stereo face tracking and gaze detection system to measure head pose and gaze direction simultaneously. The key aspect of our system is the use of real-time stereo vision together with a simple algorithm which is suitable for real-time processing. Since the 3D coordinates of the features on a face can be directly measured in our system, we can significantly simplify the algorithm for 3D model fitting to obtain the full 3D pose of the head compared with conventional systems that use a monocular camera. Consequently we achieved a non-contact, passive, real-time, robust, accurate and compact measurement system for head pose and gaze direction. The recognition of attentions and gestures of a person is demonstrated in the experiments.

Citations Scopus - 42
2000 Göcke R, Millar JB, Zelinsky A, Robert-Ribes J, 'Automatic extraction of lip feature points', ACRA 2000. Australian Conference on Robotics and Automation 2000, Melbourne, Australia (2000)
2000 Oh S, Zelinsky A, Taylor K, 'Autonomous battery recharging for indoor mobile robots', ACRA 2000. Australian Conference on Robotics and Automation 2000, Melbourne, Australia (2000)
2000 Thompson S, Matsui T, Zelinsky A, 'Localisation using automatically selected landmarks from panoramic images', Proceedings of Australian Conference on Robotics and Automation (ACRA2000), Melbourne, Australia (2000)
1999 Rahman S, Zelinsky A, 'Mobile robot navigation based on localisation using hidden Markov models', ACRA 1999. Australian Conference on Robotics and Automation 1999, Brisbane, Australia (1999)
1999 Sotelo MÁ, Zelinsky A, Rodríguez FJ, Bergasa LM, 'Real-time Road Tracking using Templates Matching', IIA/SOCO 1999: Proceedings of the Third ICSC Symposia on Intelligent Industrial Automation (IIA'99) and Soft Computing (SOCO'99), Genoa, Italy (1999)
1999 Gaskett C, Wettergreen D, Zelinsky A, 'Reinforcement learning applied to the control of an autonomous underwater vehicle', ACRA 1999. Australian Conference on Robotics and Automation 1999, Brisbane, Australia (1999)
1999 Wettergreen D, Gaskett C, Zelinsky A, 'Reinforcement learning for a visually-guided autonomous underwater vehicle', Proceedings of the 11th International Symposium on Unmanned Untethered Submersible Technology, Durham, New Hampshire (1999)
1999 Cheng G, Zelinsky A, 'Supervised Autonomy: a framework for human robot systems development', Proceedings of the IEEE International Conference on Systems, Man and Cybernetics (1999)

In this paper we present a paradigm for robot control, Supervised Autonomy. Supervised Autonomy is a framework which facilitates the development of human robot systems. Each of its components has been devised to assist users in accomplishing their task. Experimental results from applying this framework to a teleoperation system are presented, together with our current progress and planned future work.

Citations Scopus - 2
1999 Hara I, Zelinsky A, Matsui T, Asoh H, Kurita T, Tanaka M, Hotta K, 'Communicative functions to support human robot cooperation', IEEE International Conference on Intelligent Robots and Systems (1999)

We have been developing an autonomous robotic agent that helps people in a real world environment, such as an office. When a robotic agent works by cooperating with a person in a real world environment, it must manage a lot of information and deal with the knowledge and languages that people usually use. Therefore it is important for the agent to recognize what people request as soon as possible. To realize communication with people, the agent should provide robust communicative functions to obtain information from them. In this paper, we discuss the communicative functions of our robotic agent, called Jijo-2. In particular, we focus on the problem of detecting human faces, and discuss how robust human face detection can be achieved.

Citations Scopus - 7
1999 Matsumoto Y, Zelinsky A, 'Real-Time stereo face tracking system for visual human interfaces', Proceedings - International Workshop on Recognition, Analysis, and Tracking of Faces and Gestures in Real-Time Systems, RATFG-RTS 1999 (1999)

When a person instructs operations to a robot, or performs a cooperative task with a robot, it is necessary to inform the robot of the person's intention and attention. Since the motion of a person's face and the direction of the gaze is deeply related with the person's intention and attention, detection of such motions can be utilized as a natural way of communication in human-robot interaction. In this paper, we describe our real-time stereo face tracking system. The key of our system is the use of stereo vision. Since the 3D coordinates of the features on the face can be directly measured in our system, we can drastically simplify the algorithm of the 3D model fitting to obtain the full 3D pose of the head compared with conventional systems with a monocular camera. Consequently we achieved a non-contact, passive, real-time, robust, accurate and compact measurement system of a human's head pose and gaze direction.

DOI 10.1109/RATFG.1999.799227
Citations Scopus - 6
1999 Gaskett C, Wettergreen D, Zelinsky A, 'Q-learning in continuous state and action spaces', ADVANCED TOPICS IN ARTIFICIAL INTELLIGENCE, UNIV NEW S WALES, SYDNEY, AUSTRALIA (1999)
Citations Scopus - 43, Web of Science - 17
1999 Zelinsky A, 'Visual human-machine interaction', ADVANCED TOPICS IN ARTIFICIAL INTELLIGENCE, UNIV NEW S WALES, SYDNEY, AUSTRALIA (1999)
1999 Matsumoto Y, Zelinsky A, 'Real-time face tracking system for human-robot interaction', Proceedings of the IEEE International Conference on Systems, Man and Cybernetics (1999)

When a person instructs operations to a robot, or performs a cooperative task with a robot, it is necessary to inform the robot of the person's intention and attention. Since the motion of a person's face and the direction of the gaze is deeply related with person's intention and attention, detection of such motions can be utilized as a natural way of communication in human-robot interaction. In this paper, we describe our real-time stereo face tracking system. The key of our system is the use of a stereo vision. Since the 3D coordinates of the features on the face can be directly measured in our system, we can drastically simplify the algorithm of the 3D model fitting to obtain the full 3D pose of the head compared with conventional system with monocular camera. Consequently we achieved a non-contact, passive, real-time, robust, accurate and compact measurement system of human's head pose and gaze direction.

Citations Scopus - 23
1999 Koshizen T, Bartlett P, Zelinsky A, 'Sensor fusion of odometry and sonar sensors by the Gaussian Mixture Bayes' technique in mobile robot position estimation', Proceedings of the IEEE International Conference on Systems, Man and Cybernetics (1999)

Modelling and reducing uncertainty are two essential problems in mobile robot localisation. Previously we developed a robot localisation system, namely the Gaussian Mixture of Bayes with Regularised Expectation Maximisation (GMB-REM), using sonar sensors. GMB-REM allows a robot's position to be modelled as a probability distribution, and uses Bayes' theorem to reduce the uncertainty of its location. In this paper, a new system for performing sensor fusion is introduced, namely an enhanced form of GMB-REM. Empirical results show the new system outperforms GMB-REM using sonar alone. More specifically, it is able to constrain the error in the robot's position even when sonar signals are noisy.

Citations Scopus - 7
1999 Zelinsky A, 'Advances in robot vision: Mechanisms and algorithms', First Asian Symposium on Industrial Automation and Robotics, Bangkok, Thailand (1999)
1999 Bianco G, Zelinsky A, 'Biologically-inspired visual landmark learning and navigation for mobile robots', IEEE International Conference on Intelligent Robots and Systems (1999)

This paper presents a biologically-inspired method for navigating using visual landmarks which have been self-selected within natural environments. A landmark is a region of the grabbed image which is chosen according to its reliability measured through a phase (Turn Back and Look - TBL) that mimics the behavior of some social insects. From the self-chosen landmarks suitable navigation information can be extracted following a well known model introduced in Biology to explain the bee's navigation behavior. The landmark selection phase affects the conservativeness of the navigation vector field thus allowing us to explain the navigation model in terms of a visual potential function which drives the navigation to the goal. The experiments have been performed using a Nomad200 mobile robot equipped with monocular color vision.

Citations Scopus - 14
1999 Jung D, Zelinsky A, 'Integrating spatial and topological navigation in a behaviour-based multi-robot application', IEEE International Conference on Intelligent Robots and Systems (1999)

According to the behaviour-based philosophy, the structure of an agent's internal representations of the environment should not be explicitly imposed by the designer; they should be grounded in its sensor-action space. This paper presents a scheme in which the agent's action selection mechanism gives rise to an integrated spatial and topological navigation and mapping capability. The navigation behaviour emerges from the notion of location feature detectors and homogeneous action selection. The scheme is demonstrated using two autonomous mobile robots in a multi-robot cooperation scenario.

Citations Scopus - 3
1999 Heinzmann J, Zelinsky A, 'Safe control of human-friendly robots', IEEE International Conference on Intelligent Robots and Systems (1999)

This paper introduces a new approach to the control of robot manipulators in a way that is safe for humans in the robot's workspace. Conceptually the robot is viewed as a tool with limited autonomy. The limited perception capabilities of automatic systems prohibit the construction of failsafe robots with the capabilities of people. Instead, the goal of our control scheme is to make interaction with a robot manipulator safe by making the robot's actions predictable and understandable to the human operator. At the same time, the forces the robot applies with any part of its body to its environment have to be controllable and limited. Experimental results are presented of a human-friendly robot controller that is under development for a Barrett Whole Arm Manipulator robot.

Citations Scopus - 8
1999 Ward K, Zelinsky A, McKerrow P, 'Learning to avoid objects and dock with a mobile robot', ACRA 1999. Australian Conference on Robotics and Automation 1999, Brisbane, Australia (1999)
1999 Newman R, Zelinsky A, 'Error analysis of head pose and gaze direction from stereo vision', ACRA 1999. Australian Conference on Robotics and Automation 1999, Brisbane, Australia (1999)
1999 Thompson S, Zelinsky A, Srinivasan M, 'Automatic landmark selection for navigation with panoramic vision', ACRA 1999. Australian Conference on Robotics and Automation 1999, Brisbane, Australia (1999)
1999 Truong SN, Kieffer J, Zelinsky A, 'A cable-driven pan-tilt mechanism for active vision', ACRA 1999. Australian Conference on Robotics and Automation 1999, Brisbane, Australia (1999)
1999 Heinzmann J, Zelinsky A, 'Bounding Errors for Improved 3D Face Tracking in Visual Interfaces', ACRA 1999. Australian Conference on Robotics and Automation 1999, Brisbane, Australia (1999)
1999 Loy G, Newman R, Zelinsky A, Moore J, 'An Alternative Approach to Recovering 3D Pose Information from 2D Data', Proceedings of Australian Conference on Robotics and Automation ACRA'99, Brisbane, Australia (1999)
1999 Wettergreen D, Gaskett C, Zelinsky A, 'Autonomous guidance and control for an underwater robotic vehicle', Proceedings of the International Conference on Field and Service Robotics (FSR'99), Pittsburgh, USA (1999)
1998 Jung D, Cheng G, Zelinsky A, 'Robot cleaning: An application of distributed planning and real-time vision', Field and Service Robotics, Canberra, Australia (1998)
DOI 10.1007/978-1-4471-1273-0_30
1998 Cheng G, Zelinsky A, 'Real-time vision processing for a soccer playing mobile robot', RoboCup-97: Robot Soccer World Cup I, Nagoya, Japan (1998)
DOI 10.1007/3-540-64473-3_56
1998 Ward K, Zelinsky A, 'Acquiring mobile robot behaviors by learning trajectory velocities with multiple FAM matrices', 1998 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS 1-4, KATHOLIEKE UNIV LEUVEN, LEUVEN, BELGIUM (1998)
Citations Scopus - 1, Web of Science - 1
1998 Jung D, Heinzmann J, Zelinsky A, 'Range and pose estimation for visual Servoing of a mobile robot', 1998 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS 1-4, KATHOLIEKE UNIV LEUVEN, LEUVEN, BELGIUM (1998)
Citations Web of Science - 4
1998 Cheng G, Zelinsky A, 'Goal-oriented behaviour-based visual navigation', 1998 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS 1-4, KATHOLIEKE UNIV LEUVEN, LEUVEN, BELGIUM (1998)
Citations Scopus - 9, Web of Science - 3
1998 Brooks A, Abdallah S, Zelinsky A, Kieffer J, 'A multimodal approach to real-time active vision', 1998 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS - PROCEEDINGS, VOLS 1-3, VICTORIA, CANADA (1998)
Citations Scopus - 2
1998 Wettergreen D, Gaskett C, Zelinsky A, 'Development of a visually-guided autonomous underwater vehicle', OCEANS'98 - CONFERENCE PROCEEDINGS, VOLS 1-3, NICE, FRANCE (1998)
Citations Scopus - 23, Web of Science - 2
1998 Heinzmann J, Zelinsky A, '3-D facial pose and gaze point estimation using a robust real-time tracking paradigm', AUTOMATIC FACE AND GESTURE RECOGNITION - THIRD IEEE INTERNATIONAL CONFERENCE PROCEEDINGS, NARA, JAPAN (1998)
DOI 10.1109/AFGR.1998.670939
Citations Scopus - 79, Web of Science - 48
1998 Jung D, Cheng G, Zelinsky A, 'Experiments in realising cooperation between autonomous mobile robots', EXPERIMENTAL ROBOTICS V, UNIV POLITECNICA CATALUNYA, BARCELONA, SPAIN (1998)
Citations Web of Science - 1
1998 Ward K, Zelinsky A, 'Genetically Evolving Robot Perception to Effectively Learn Multiple Behaviours Simultaneously', Second IEEE International Conference on Intelligent Processing Systems, Gold Coast, Australia (1998)
1997 Heinzmann J, Zelinsky A, 'Robust real-time face tracking and gesture recognition', IJCAI-97 - PROCEEDINGS OF THE FIFTEENTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOLS 1 AND 2, NAGOYA, JAPAN (1997)
Citations Scopus - 31, Web of Science - 4
1997 O'Hagan R, Zelinsky A, 'Finger track - A robust and real-time gesture interface', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (1997)

© Springer-Verlag Berlin Heidelberg 1997. Real-time computer vision combined with robust gesture recognition provides a natural alternative to traditional computer interfaces. Human users have plenty of experience with actions and the manipulation of objects requiring finger movement. In place of a mouse, users could use their hands to select and manipulate data. This paper presents a first step in this approach using a finger as a pointing and selection device. A major feature of a successful tracking system is robustness. The system must be able to acquire tracked features upon startup, and reacquire them if lost during tracking. Reacquisition should be fast and accurate (i.e. it should pick up the correct feature). Intelligent search algorithms are needed for speedy, accurate acquisition of lost features within the frame. The prototype interface presented in this paper is based on finger tracking as a means of input to applications. The focus of the discussion is how the system can be made to perform robustly in real-time. Dynamically distributed search windows are defined for searching within the frame. The location and number of search windows are dependent on the confidence in the tracking of features. Experimental results showing the effectiveness of these techniques are presented.

Citations Scopus - 25
1997 Cheng G, Zelinsky A, 'Supervised autonomy: A paradigm for teleoperating mobile robots', IROS '97 - PROCEEDINGS OF THE 1997 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOT AND SYSTEMS: INNOVATIVE ROBOTICS FOR REAL-WORLD APPLICATIONS, VOLS 1-3, GRENOBLE, FRANCE (1997)
Citations Scopus - 9
1996 Jung D, Zelinsky A, 'Whisker based mobile robot navigation', IROS 96 - PROCEEDINGS OF THE 1996 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS - ROBOTIC INTELLIGENCE INTERACTING WITH DYNAMIC WORLDS, VOLS 1-3, SENRI LIFE SCI CTR, OSAKA, JAPAN (1996)
Citations Scopus - 28, Web of Science - 10
1996 Cheng G, Zelinsky A, 'Real-time visual behaviours for navigating a mobile robot', IROS 96 - PROCEEDINGS OF THE 1996 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS - ROBOTIC INTELLIGENCE INTERACTING WITH DYNAMIC WORLDS, VOLS 1-3, SENRI LIFE SCI CTR, OSAKA, JAPAN (1996)
Citations Scopus - 31, Web of Science - 7
1996 Zelinsky A, Heinzmann J, 'Human-robot interaction using facial gesture recognition', RO-MAN '96 - 5TH IEEE INTERNATIONAL WORKSHOP ON ROBOT AND HUMAN COMMUNICATION, PROCEEDINGS, AIST TSUKUBA RES CTR, TSUKUBA, JAPAN (1996)
DOI 10.1109/ROMAN.1996.568837
Citations Scopus - 8, Web of Science - 5
1996 Zelinsky A, Heinzmann J, 'Real-time visual recognition of facial gestures for human-computer interaction', PROCEEDINGS OF THE SECOND INTERNATIONAL CONFERENCE ON AUTOMATIC FACE AND GESTURE RECOGNITION, KILLINGTON, VT (1996)
DOI 10.1109/AFGR.1996.557290
Citations Scopus - 25, Web of Science - 14
1995 Cheng G, Zelinsky A, 'A physically grounded search in a behaviour based robot', AI '95 Proceedings of the Eighth Australian Joint Conference on Artificial Intelligence, Canberra, Australia (1995)
DOI 10.1142/2955
1995 Zelinsky A, Kuniyoshi Y, Suehiro T, Tsukune H, 'Using an augmentable resource to robustly and purposefully navigate a robot', Proceedings - IEEE International Conference on Robotics and Automation (1995)

We present a scheme for specifying and executing purposive navigation tasks for a behaviour-based mobile robot. A user specifies the robot's navigation task in general and qualitative terms using a graphical resource called the Purposive Map (PM). The robot navigates using the incomplete and approximate information stored in the PM as an aid to achieve the specified mission. The robot is able to augment the knowledge provided in the PM with environment information learnt by the robot. Using the augmented PM, the robot learns how to perform efficient obstacle avoidance. We present experimental results using a real robot to show our scheme is robust. Our robot can escape from dead-ends, can deduce that goals are unreachable and can withstand disturbances to the environment between missions.

1994 Zelinsky A, Kuniyoshi Y, Tsukune H, 'Monitoring and co-ordinating behaviours for purposive robot navigation', IEEE/RSJ/GI International Conference on Intelligent Robots and Systems (1994)

This paper presents a new scheme for purposive navigation for mobile agents. The new scheme is robust, qualitative and provides a mechanism for combining mapping, planning and mission execution for a mobile agent into a single data structure called the Purposive Map (PM). The agent can navigate using incomplete and approximate information stored in the PM. We present a novel approach to obstacle avoidance for a behaviour-based robot. Our approach is based on using a physically grounded search while monitoring and co-ordinating behaviours. The physically grounded search exploits stagnation points (local minima) to guide the search for the shortest path to a target. This scheme enables our robot to escape from dead-end situations and allows it to deduce that a target location is unreachable. Simulation results are presented.

Citations Scopus - 1
1993 Zelinsky A, Yuta S, 'A unified approach to planning, sensing and navigation for mobile robots', Experimental Robotics III, The 3rd International Symposium, October 28-30, 1993. Lecture Notes in Control and Information Sciences 200, Kyoto, Japan (1993)
1993 Zelinsky A, Jarvis RA, Byrne JC, Yuta S, 'Planning Paths of Complete Coverage of an Unstructured Environment by a Mobile Robot', Proceedings of International Conference on Advanced Robotics, Tsukuba, Japan (1993)
1992 Zelinsky A, 'Mobile robot navigation - Combining local obstacle avoidance and global path planning', AI '92. Proceedings of the 5th Australian Joint Conference on Artificial Intelligence, Hobart, Tasmania (1992)
1992 Zelinsky A, Dowson I, 'Continuous smooth path execution for an autonomous guided vehicle (AGV)', Melbourne, Vic (1992)
1992 Zelinsky A, 'A navigation algorithm for industrial mobile robots', Melbourne, Vic (1992)
1990 Zelinsky A, 'A Mobile Robot Control System based on a Transputer Network', The Transputer in Australasia. ATOUG-3. Proceedings of the 3rd Australian Transputer and OCCAM User Group Conference, Sydney, Australia (1990)
1990 Zelinsky A, 'Environment mapping with a mobile robot using sonar', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (1990)

© Springer-Verlag Berlin Heidelberg 1990. This paper describes a method of producing high resolution maps of an indoor environment with an autonomous mobile robot equipped with sonar range finding sensors. This method is based upon investigating obstacles in the near vicinity of a mobile robot. The mobile robot examines the straight line segments extracted from the sonar range data describing obstacles near the robot. The mobile robot then moves parallel to the straight line sonar segments, in close proximity to the obstacles, continually applying the sonar barrier test. The sonar barrier test exploits the physical constraints of sonar data, and eliminates noisy data. This test determines whether or not a sonar line segment is a true obstacle edge or a false reflection. Low resolution sonar sensors can be used with the described method. The performance of the algorithm is demonstrated using a Denning Corp. Mobile Robot, equipped with a ring of Polaroid Corp. Ultrasonic Rangefinders.

DOI 10.1007/3-540-52062-7_90
1989 Zelinsky A, 'Navigation By Learning', Proceedings of the IEEE/RSJ International Workshop on Intelligent Robots and Systems (IROS '89): The Autonomous Mobile Robots and Its Applications, Tsukuba, Japan (1989)
DOI 10.1109/IROS.1989.637926
1988 Zelinsky A, 'Robot navigation with learning', Australian Computer Science Communications, St Lucia, QLD (1988)
Citations Scopus - 1

Thesis / Dissertation (1 outputs)

Year Citation Altmetrics Link
1991 Zelinsky A, Environment exploration and path planning algorithms for a mobile robot using sonar, University of Wollongong (1991)

News

Welcome Professor Alex Zelinsky AO

November 19, 2018

The University of Newcastle is delighted to welcome Professor Alex Zelinsky AO, who commences today as the University’s 8th Vice-Chancellor and President.

New Vice-Chancellor announced

June 20, 2018

The Council of The University of Newcastle has announced the appointment of Dr Alex Zelinsky AO as its next Vice-Chancellor and President, commencing in early November 2018.

Professor Alex Zelinsky

Position

Vice-Chancellor
Office of the Vice-Chancellor
Vice-Chancellor's Division

Contact Details

Email alex.zelinsky@newcastle.edu.au