Quanser Autonomous Vehicles Research Studio

A comprehensive teaching and research station built around autonomous and collaborative robots. It consists of flying and ground robots, a control and monitoring station, a motion capture system, and safety equipment (nets, floor mats, etc.).

Sample research topics:

  • Control of a single flying or ground robot (manual and automatic modes)
  • Vision-based localization
  • Communication between robots 
  • Cooperation between robots 
  • Mapping 
  • Navigation 
  • Multi-agent systems (swarms); a minimal coordination sketch follows this list
  • And others 
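
The multi-agent (swarm) topic above can be illustrated with a minimal consensus (rendezvous) update, in which each agent moves toward the average position of its neighbours. This is a generic sketch only; the positions, neighbour graph, and gain are made-up example values, and it is not tied to the studio's QUARC/Simulink tooling.

```python
# Illustrative sketch only: one step of a discrete-time consensus update.
# Positions, neighbour graph and gain are made-up example values.
import numpy as np

def consensus_step(positions, adjacency, gain=0.1):
    """Move each agent a small step toward the average of its neighbours."""
    positions = np.asarray(positions, dtype=float)
    new_positions = positions.copy()
    for i, row in enumerate(adjacency):
        neighbours = [j for j, connected in enumerate(row) if connected and j != i]
        if neighbours:
            offset = np.mean(positions[neighbours] - positions[i], axis=0)
            new_positions[i] = positions[i] + gain * offset
    return new_positions

# Example: three agents on a line graph (0-1, 1-2), 2-D positions in metres.
positions = [[0.0, 0.0], [1.0, 0.5], [2.0, 0.0]]
adjacency = [[0, 1, 0],
             [1, 0, 1],
             [0, 1, 0]]
print(consensus_step(positions, adjacency))
```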

Base station: 

  • PC with Windows 10 Professional operating system, 6-core Intel processor, 32 GB RAM, drives: 256 GB SSD and 1 TB HDD
  • High performance GPU 
  • USB keyboard, USB optical mouse 
  • Three 24" monitors
  • QUARC real-time control software, an add-on for MATLAB/Simulink (note: MATLAB/Simulink licenses are not included in the package)
  • USB controller 
  • Wi-Fi router with a built-in processor, supporting the 802.11ac standard at up to 1,900 Mbps (a communication sketch follows this list)
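
As a rough illustration of base-station-to-robot communication over the Wi-Fi network, the sketch below sends a velocity set-point as a UDP datagram. The IP address, port, and JSON message format are hypothetical; in practice the studio's communication is handled through QUARC/Simulink blocks rather than hand-written sockets.

```python
# Illustrative sketch only: sending a small telemetry packet from the base
# station to a robot over the local network. Address, port and message
# format are hypothetical, not part of the studio's software.
import json
import socket

ROBOT_ADDRESS = ("192.168.2.10", 18000)  # hypothetical robot IP and port

def send_setpoint(vx, wz):
    """Send a linear/angular velocity set-point as a JSON datagram."""
    message = json.dumps({"vx": vx, "wz": wz}).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(message, ROBOT_ADDRESS)

if __name__ == "__main__":
    send_setpoint(0.3, 0.0)  # drive straight ahead at 0.3 m/s
```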

Flying Robot: 

  • Four-rotor configuration (quadrotor), payload capacity 300 g, power supply: 11 V Li-Po batteries sized for a minimum of 10 minutes of flight under maximum load (a thrust-mixing sketch follows this list)
  • Built-in programmable control platform: Intel Aero Compute Board with a 64-bit quad-core processor, 4 GB DDR3-1600 RAM, 8x PWM, 2x UART, 3x SPI, 4x ADC, 3x encoder inputs, 5x GPIO
  • Main camera for orientation in the environment: Intel RealSense R200 stereoscopic 3D camera (RGB camera plus 2 IR sensors, depth range up to 3 m, resolution 1080p HD @ 30 fps or VGA @ 60 fps); additional downward-facing camera for ground observation: VGA resolution @ 120 fps
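
For the quadrotor, a typical low-level step is mapping a collective-thrust command and three torque commands to the four rotor thrusts. The sketch below shows a generic allocation for an X-configuration quadrotor; the arm length and yaw coefficient are assumed example numbers, not parameters of this drone or its firmware.

```python
# Illustrative sketch only: generic thrust/torque-to-rotor allocation for an
# X-configuration quadrotor with alternating rotor spin directions.
import numpy as np

ARM = 0.10     # hypothetical distance from centre to rotor [m]
K_YAW = 0.02   # hypothetical drag-to-thrust ratio [m]

def mix(thrust, tau_roll, tau_pitch, tau_yaw):
    """Return per-rotor thrusts [N] for (front-left, front-right, rear-left, rear-right)."""
    l = ARM / np.sqrt(2.0)  # lever arm of each rotor about the x and y axes
    A = np.array([
        [1.0,     1.0,    1.0,    1.0],     # total thrust
        [l,      -l,      l,     -l],       # roll torque
        [l,       l,     -l,     -l],       # pitch torque
        [-K_YAW,  K_YAW,  K_YAW, -K_YAW],   # yaw torque (reaction drag)
    ])
    return np.linalg.solve(A, np.array([thrust, tau_roll, tau_pitch, tau_yaw]))

# Example: hover thrust for an assumed ~1.3 kg take-off mass, no torque demands.
print(mix(1.3 * 9.81, 0.0, 0.0, 0.0))
```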

Ground Robot: 

  • Programmable two-wheel differential-drive platform (Kobuki mobile base), diameter 35 cm, height 27 cm, payload capacity 4.5 kg, power supply: Li-Po batteries sized for 2 hours of operation under maximum load, maximum linear speed 0.7 m/s (a wheel-speed sketch follows this list)
  • Built-in open-architecture programmable controller: Raspberry Pi 3B+ (quad-core Cortex-A53 @ 1.4 GHz) with an integrated 802.11 b/g/n/ac Wi-Fi module
  • Microsoft Kinect RGB-D motion sensor
  • Other sensors: bump sensors (3x), distance sensor (range 0.5 to 4 m), wheel-drop sensors (on both wheels), cliff sensors (on the housing), 3-axis gyroscope, wheel encoders, programmable color LEDs (2x), IR proximity sensors for orientation in the environment
  • Expansion inputs/outputs: 24 configurable I/O channels, ADC inputs (2x), PWM (4x), encoder inputs (2x), UART, SPI, I2C, USB ports (2x), 1 Gb Ethernet, MIPI CSI camera port, MIPI DSI port for connecting a touch panel
  • Camera for orientation in the environment: VGA resolution @ 30 fps
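
A common first exercise with the differential-drive base is converting a desired body velocity into left and right wheel speeds while respecting the 0.7 m/s limit quoted above. The sketch below assumes a hypothetical wheel separation and is not taken from the vendor's software.

```python
# Illustrative sketch only: differential-drive velocity-to-wheel-speed
# conversion with a shared speed limit. Wheel separation is an assumed value.
WHEEL_SEPARATION = 0.23   # hypothetical distance between the two wheels [m]
MAX_WHEEL_SPEED = 0.7     # m/s, maximum linear speed from the specification

def wheel_speeds(v, w):
    """Return (left, right) wheel speeds [m/s] for body velocity v [m/s] and yaw rate w [rad/s]."""
    left = v - 0.5 * WHEEL_SEPARATION * w
    right = v + 0.5 * WHEEL_SEPARATION * w
    # Scale both wheels down together if either exceeds the speed limit.
    scale = max(abs(left), abs(right)) / MAX_WHEEL_SPEED
    if scale > 1.0:
        left, right = left / scale, right / scale
    return left, right

print(wheel_speeds(0.5, 1.0))   # e.g. 0.5 m/s forward while turning at 1 rad/s
```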

Additional equipment: 

  • Motion capture system covering the workspace: NaturalPoint OptiTrack Flex 13; the number of cameras depends on the size of the laboratory (a pose-conversion sketch follows this list)
  • Vertical safety nets securing the workspace, with mounting brackets
  • Foam floor mat protecting the working area 
  • Battery chargers for drones and wheeled robots (1 charger per vehicle)
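
The motion capture system reports each tracked vehicle's pose as a position plus an orientation quaternion; a planar controller typically needs only the yaw angle. The sketch below shows that conversion in isolation; the sample quaternion is a made-up value and no vendor streaming API is used.

```python
# Illustrative sketch only: extracting yaw from a rigid-body orientation
# quaternion, as reported by a motion-capture system for a tracked vehicle.
import math

def yaw_from_quaternion(qx, qy, qz, qw):
    """Yaw (rotation about the vertical z axis) in radians, Z-Y-X Euler convention."""
    return math.atan2(2.0 * (qw * qz + qx * qy),
                      1.0 - 2.0 * (qy * qy + qz * qz))

# Example: a body rotated 90 degrees about z gives a yaw of ~pi/2.
print(yaw_from_quaternion(0.0, 0.0, math.sqrt(0.5), math.sqrt(0.5)))
```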