ANDRES GONGORA

I am a Ph.D. student at the Machine Perception and Intelligent Robotics (MAPIR) group at the University of Malaga, where I research new technologies to improve the sensory capabilities of mobile robots.

Research Interests

  • Mobile Robotics: obstacle avoidance and navigation
  • Embedded Systems: hardware and software development
  • Spatial Point Processes: generation of gas/wind maps from sparse measurements
  • Unmanned Aerial Vehicles (UAVs): controller design and flight control

Brief CV

I was born in Schaffhausen (Switzerland) in 1990, close to the German border, and moved to southern Spain at the age of 8. There I developed a strong curiosity for information technologies and electronics in general (I used to take our HiFi apart to “improve” its reception, something my parents were not exactly enthusiastic about). Still, my family supported me all along, and years later I found myself studying at the University of Malaga, where I received my B.Sc. in Industrial Engineering with a specialization in Electronics (2012), my B.Sc. in Electronic Engineering (2014), and my M.Sc. in Software Engineering and Artificial Intelligence (2015).

It was also during these years that small UAVs, and drones in particular, gained great momentum. They fascinated me to the point where I wanted to build my own, but it had to be different: I wanted mine to be fully autonomous and capable of serious onboard computation. This eventually led me to the MAPIR research group in 2014, where I subsequently started my Ph.D.

Although I still have a long way to go, I have so far had the chance to work with autonomous robots, RGB-D cameras, ROS, artificial olfaction, and gas distribution mapping, as well as to lecture on several subjects related to computer science and mobile robotics.

[DOWNLOAD MY FULL CV]

[VISIT MY PERSONAL BLOG]

Contact info


Andres Gongora
andresgongora [at] uma.es
+34 952133361


Boulevard Louis Pasteur, 35, Lab 2.3.6
Higher Technical School of Computer Science Engineering (ETSI Informática)
29071 Malaga (Malaga) - SPAIN





Google Scholar


Skills

  • Engineering
  • Languages
  • Computer skills
  • Coding


Projects and research


A modular and self-arbitrated hardware architecture:
A new approach for electronic noses

The main limitation of electronic noses (e-noses), and of embedded systems in general, lies in the versatility of their hardware. Adapting to a broad range of possible applications requires all sorts of sensors, transducers, and processing capabilities, but carrying all imaginable hardware on a single device is usually not feasible in terms of size, weight, and cost. Moreover, most applications do not require all functions simultaneously.

We address this problem with a completely modular architecture based on self-arbitrated and self-contained modules, offering a way to build different, application-specific devices (e.g. e-noses) from interconnectable building blocks. This not only brings versatility and reusability to the design, but also reduces development costs and ensures long-term serviceability, since new modules can be added as needed.

To that end, we have combined MAVLink, a communication protocol originally developed for UAVs, with I²C configured for multi-master broadcast communication. This provides all modules with a single communication medium on which they can seamlessly share all sorts of information and commands, and it also allows the creation of distributed sensors that, connected through wireless modules, behave like a single monolithic device.

Note that this technology is not limited to e-noses alone, but can be extrapolated to Electronic Control Units (ECUs) in modern cars, or to the arbitration of distributed power generation and storage solutions, among many other examples.
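
To make the idea more concrete, the following Python sketch shows how a module on a Linux-based board might pack a standard MAVLink HEARTBEAT message with pymavlink and broadcast it on an I²C bus with smbus2. It is an illustrative sketch only, not the modules' actual firmware; the bus number, source IDs, and the use of the I²C general-call address are assumptions.

    # Illustrative sketch: pack a MAVLink message and broadcast it over I2C.
    # Assumes a Linux host with pymavlink and smbus2 installed and an I2C
    # adapter on bus 1; the real modules run their own embedded firmware.
    from pymavlink import mavutil
    from smbus2 import SMBus

    I2C_BUS = 1          # assumed bus number
    GENERAL_CALL = 0x00  # I2C "general call" (broadcast) address

    # Encode a HEARTBEAT so the module can announce itself to every peer on the bus.
    mav = mavutil.mavlink.MAVLink(None, srcSystem=1, srcComponent=1)
    heartbeat = mav.heartbeat_encode(
        mavutil.mavlink.MAV_TYPE_GENERIC,
        mavutil.mavlink.MAV_AUTOPILOT_INVALID,
        0, 0, mavutil.mavlink.MAV_STATE_ACTIVE,
    )
    frame = heartbeat.pack(mav)  # raw MAVLink byte frame

    # Broadcast the frame in 32-byte chunks (the SMBus block-write limit),
    # using the chunk offset as the command byte so receivers can reassemble it.
    # force=True because the general-call address is normally reserved by the kernel.
    with SMBus(I2C_BUS, force=True) as bus:
        for offset in range(0, len(frame), 32):
            chunk = list(frame[offset:offset + 32])
            bus.write_i2c_block_data(GENERAL_CALL, offset & 0xFF, chunk)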

Integration of artificial olfaction in mobile robotics

Robotic olfaction is the discipline that brings together artificial olfaction and mobile robotics. It involves equipping robots with small, portable electronic noses (e-noses) in order to extend their sensory capabilities with the perception of volatile substances. The relevance of these “olfactory robots” lies not only in their usefulness for a wide range of potential odor-related applications, but also in the boost that the sense of smell gives to the design of more intelligent robotic behaviors.

One of the most relevant tasks for olfactory robots is the automatic localization of a gas emission source, known as gas source localization (GSL). Traditionally, GSL has been tackled with autonomous mobile robots in an attempt to automate the search process, and different approaches, ranging from bio-inspired techniques to engineering solutions, have been proposed. Yet, due to the still limited capabilities of autonomous robots and the complex mechanisms that rule gas dispersion, most works in this field have only been validated under laboratory conditions (i.e., unidirectional and laminar wind fields, absence of obstacles in the environment, etc.), far from the complexity of real-world settings.

In this project, we are seeking to overcome these limitations by studying GSL from a teleoperation standpoint, so that we may understand how humans process the robot's sensory information to locate the source. In particular, we are interested in the development of a bio-inspired algorithm that may, one day, offer a solid solution to GSL in industrial applications.


Completely autonomous drones:
A proof of concept

Enhancement of a commercial multicopter for autonomous navigation. We equipped a 1.5 kg hexacopter with a UDOO Quad (4-core 1 GHz Cortex-A9) and an RGB-D camera. With them, the hexacopter was able to track and follow a color beacon at a constant distance by communicating the desired pose to the underlying APM Copter controller.

We detect the beacon with OpenCV (through simple color segmentation) and retrieve the depth of the associated pixels from the RGB-D camera; to hold altitude, we use a sonar mounted on the bottom of the drone. Once all distances are known, several PID controllers running in parallel on the UDOO compute the new desired pose, which is then communicated to the drone.
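
For illustration, here is a minimal Python/OpenCV sketch of that pipeline: segment the beacon by color, read its mean depth from the RGB-D frame, and feed the distance error to a PID controller. This is not the project's actual onboard code; the HSV thresholds, target distance, and gains are placeholder values.

    # Minimal sketch of the beacon-tracking idea: color segmentation on the RGB
    # image, depth lookup at the beacon pixels, and a PID on the distance error.
    # Thresholds and gains are placeholders, not the actual onboard values.
    import cv2
    import numpy as np

    TARGET_DISTANCE_M = 2.0              # desired distance to the beacon (assumed)
    LOWER_HSV = np.array([0, 120, 80])   # placeholder HSV range for a red beacon
    UPPER_HSV = np.array([10, 255, 255])

    class PID:
        def __init__(self, kp, ki, kd):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, error, dt):
            self.integral += error * dt
            derivative = (error - self.prev_error) / dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    def beacon_distance(rgb_frame, depth_frame):
        """Segment the beacon by color and return its mean depth in meters, or None."""
        hsv = cv2.cvtColor(rgb_frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)
        depths = depth_frame[mask > 0]
        depths = depths[depths > 0]      # discard invalid (zero) depth readings
        return float(np.mean(depths)) if depths.size else None

    distance_pid = PID(kp=0.8, ki=0.05, kd=0.2)  # placeholder gains

    def control_step(rgb_frame, depth_frame, dt=0.05):
        """One control iteration: distance error -> forward/backward correction."""
        distance = beacon_distance(rgb_frame, depth_frame)
        if distance is None:
            return 0.0                   # beacon not visible: hold position
        error = distance - TARGET_DISTANCE_M
        return distance_pid.update(error, dt)  # folded into the desired pose sent to the autopilot

In the real system, similar loops run in parallel for lateral and vertical alignment, with altitude taken from the sonar as described above.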

You may find our source code here and our published paper here.


Electronic shield for education

An educational electronic device aimed at teaching and training college-level computer science and control engineering. In particular, it covers working with a microcontroller, digital input/output, several standard industrial buses (e.g. SPI, I²C), analog signal processing, RC and RLC low-pass filters, and the control of servo motors, among other topics.
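
As a flavor of the exercises it supports, the short Python sketch below computes the cutoff frequency and attenuation of a first-order RC low-pass filter; the component values are example numbers, not the shield's actual parts.

    # Example exercise: first-order RC low-pass filter analysis.
    # Component values are illustrative, not the shield's actual parts.
    import math

    R = 10e3    # resistance in ohms (example value)
    C = 100e-9  # capacitance in farads (example value)

    # Cutoff (-3 dB) frequency of the RC low-pass filter: f_c = 1 / (2*pi*R*C)
    f_c = 1.0 / (2.0 * math.pi * R * C)

    def gain_db(f):
        """Magnitude of the filter's frequency response at frequency f, in dB."""
        magnitude = 1.0 / math.sqrt(1.0 + (f / f_c) ** 2)
        return 20.0 * math.log10(magnitude)

    print(f"Cutoff frequency: {f_c:.1f} Hz")            # about 159.2 Hz for these values
    print(f"Attenuation at 1 kHz: {gain_db(1e3):.1f} dB")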

We are currently working on the fourth iteration of our educational shield. So far, we have employed it to teach control theory, real-time operating systems, and other electronics-related subjects.


Publications


comic by xkcd