A compilation of 40+ space projects we have made over the years!

Strengthening Planetary Defense: Detecting Unknown Asteroids using Open Data, Math, and Python (2022)

www.MonitorMyPlanet.com 

Asteroid collision risks are real and unpredictable. 66 million years ago, the Chicxulub asteroid impact wiped out the dinosaurs. My project strengthens planetary defense by using robotic telescopes, open data, math, and Python to find unknown asteroids. I took images from 4 telescopes located at different latitudes to get full sky coverage.

I wrote Python algorithms to query the European Space Agency’s GAIA and NASA’s Horizons sky catalogues to find all known stars and asteroids. Mean, standard deviation, and histograms were used to create masks to remove known objects. The remaining objects were classified as possible asteroid candidates. I detected 3 ‘preliminary’ asteroids. My algorithm’s plate-solving ability determined their Right Ascension and Declination using the focal length of the telescope and the celestial location of the image. I reported this information by creating a Minor Planet Center report for my images. I have made my code and methodology open-source to crowdsource planetary defense.
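
A minimal sketch of the masking step, assuming the telescope image is already loaded as a 2-D NumPy array and the known star/asteroid pixel positions returned by the GAIA and Horizons queries are available as an (N, 2) array of (x, y) coordinates; the threshold and matching radius are illustrative assumptions, not the exact pipeline:

```python
# Sketch: flag bright detections that do not match any known catalogue source.
import numpy as np

def find_candidates(image, known_xy, k=5.0, match_radius=3.0):
    """Return detections that do not match any known catalogue source."""
    mean, std = image.mean(), image.std()            # background statistics
    bright = np.argwhere(image > mean + k * std)      # pixels well above background, as (y, x)

    candidates = []
    for y, x in bright:
        # distance from this detection to every known catalogue position
        d = np.hypot(known_xy[:, 0] - x, known_xy[:, 1] - y)
        if d.min() > match_radius:                     # nothing known nearby -> possible asteroid
            candidates.append((int(x), int(y)))
    return candidates

# Example with synthetic data:
# img = np.random.normal(100, 5, (1024, 1024))
# img[500, 640] += 200                                 # an "unknown" bright source
# print(find_candidates(img, known_xy=np.array([[100.0, 200.0]])))
```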


The Silence of Global Oceans: Underwater Acoustics Impact of the COVID-19 Lockdown (2022)

http://www.MonitorMyOcean.com

Yearly increases in global trade, 80% of which moves along ocean routes, mean that underwater noise levels in the world's oceans are rising. Unfortunately, the low-frequency noise from the propellers and machinery of 60,000 container ships and ocean tankers overlaps with the frequencies that marine mammals such as whales, dolphins, and orcas use to communicate and navigate. This leads to stress and increasing collisions with ships. The COVID-19 pandemic led to a drop in marine traffic in the waters of over 70% of countries because of the economic slowdown, reduced exports, and the cancellation of the ocean tourism season.

This presented an opportunity to measure the changes in underwater anthropogenic noise levels in global oceans using data from hydrophones (underwater microphones) run by ocean observatories in the US, Canada, New Zealand, and Spain. A model was developed to standardize the measurement of anthropogenic noise levels. Twenty years of cumulative hydrophone data from seven oceanic regions were analyzed. The results have been presented as an interactive web application.
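
As an illustration of what such a standardized measurement could look like, here is a minimal sketch that estimates a low-frequency shipping-noise level from a hydrophone pressure time series; the 10-100 Hz band, the 60-second averaging window, and the function names are assumptions, not the exact model used in the project.

```python
# Sketch: band-limited sound pressure level (dB re 1 uPa) from hydrophone data.
import numpy as np
from scipy.signal import welch

def band_spl(pressure_pa, fs, f_lo=10.0, f_hi=100.0, p_ref=1e-6):
    """Sound pressure level in the f_lo-f_hi band, in dB re 1 uPa."""
    f, psd = welch(pressure_pa, fs=fs, nperseg=int(fs * 60))   # 60-second averaging windows
    band = (f >= f_lo) & (f <= f_hi)
    band_power = np.trapz(psd[band], f[band])                   # mean-square pressure in the band
    return 10.0 * np.log10(band_power / p_ref**2)

# Monthly levels before and during the lockdown can then be compared directly:
# delta_db = band_spl(p_lockdown, fs) - band_spl(p_baseline, fs)
```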


The Silence of Canadian Cities: The Seismic Impact of the COVID-19 Lockdown (2020)

https://hotpoprobot.com/2020/07/25/the-silence-of-canadian-cities-and-oceans-the-impact-of-covid19-lockdown/ 


From the analysis, it was evident that the lockdown had an impact on seismic vibrations in almost all the cities I analyzed. In every city except Ottawa, the seismic vibrations decreased by between 14% and 44%, with the biggest decrease in Yellowknife in the Northwest Territories. In the 3 densely populated cities with over 1 million people each (Toronto, Montreal, and Calgary), the seismic vibrations dropped by over 30%. In the case of Ottawa (the capital of Canada), the seismic vibrations actually increased by 8%.


The Masked Scales: Measuring and Sonifying the Impact of COVID-19 into a Lockdown Musical (2020)

NASA SpaceApps 2020 COVID-19 Challenge Global Winner

https://hotpoprobot.com/2020/06/19/the-masked-scales-sonification-of-impact-of-covid19-lockdown-toronto-lockdown-musical/


We built an instrument using an Arduino, sensors, and a camera to measure changes in street noise, vehicular traffic, emissions, and light intensity during the COVID-19 lockdown. We used machine learning (to count vehicles from the video feed) and Python to assess changes happening during and after the lockdown. We converted our analysis into a musical using 4 instruments, each representing a different variable: Marimba (light), Vibraphone (emissions), Piano (street noise), and Flute (COVID-19 infection cases in Toronto). The tempo of the music was determined by changes in city night lights (using NASA VIIRS data) and vehicular traffic counts before and after the lockdown.
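
A toy sketch of the sonification idea is shown below: a daily data series (for example, vehicle counts) is mapped onto notes of a pentatonic scale, with higher values giving higher notes. The scale and the mapping are illustrative assumptions, not the project's exact rules.

```python
# Sketch: map a data series onto MIDI notes of a C pentatonic scale.
C_PENTATONIC = [60, 62, 64, 67, 69, 72, 74, 76]   # MIDI note numbers

def sonify(values):
    """Map each value to a scale degree; higher values give higher notes."""
    lo, hi = min(values), max(values)
    notes = []
    for v in values:
        idx = int((v - lo) / (hi - lo + 1e-9) * (len(C_PENTATONIC) - 1))
        notes.append(C_PENTATONIC[idx])
    return notes

# vehicle_counts = [120, 95, 40, 22, 18, 35, 60]    # e.g. before vs. during the lockdown
# print(sonify(vehicle_counts))                      # [76, 72, 62, 60, 60, 62, 64]
```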


Applying Machine Learning to Exoplanet Data from Space Telescopes (2020)

3 Ribbons Winner, 2020 Canada Online STEM Fair, Gold Medal, 2020 IRIC North American Science Fair

https://hotpoprobot.com/2020/06/07/machine-learning-exoplanets-stellar-noise-wins-3-ribbons-at-youth-science-canada-online-stem-fair-2020/


Applying machine learning to exoplanetary data may help remove the noise of star-spots from data on transiting exoplanets’ atmospheres received by space telescopes. I created a hybrid machine learning model using Long Short-Term Memory (LSTM), a form of Recurrent Neural Network (RNN), to reduce this noise. My model was able to accurately predict the exoplanet-star radius ratio in 55 wavelengths with a mean squared error of 0.001.
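
A minimal Keras sketch of this kind of LSTM model is shown below; the sequence length, layer sizes, and shapes are assumptions for illustration rather than the exact architecture used in the project.

```python
# Sketch: an LSTM that maps a noisy multi-wavelength light curve to per-wavelength radius ratios.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

n_timesteps, n_wavelengths = 300, 55

model = keras.Sequential([
    layers.Input(shape=(n_timesteps, n_wavelengths)),   # flux at each time step and wavelength
    layers.LSTM(64),                                      # summarise the whole transit
    layers.Dense(n_wavelengths),                          # predicted Rp/Rs at each of the 55 wavelengths
])
model.compile(optimizer="adam", loss="mse")

# X: simulated noisy light curves, y: true radius ratios per wavelength
X = np.random.rand(128, n_timesteps, n_wavelengths).astype("float32")
y = np.random.rand(128, n_wavelengths).astype("float32")
model.fit(X, y, epochs=2, batch_size=16, verbose=0)
```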


ARTEMIS (Artificially intelligent Real-time Training by Environment, Mapping, Immersion, and Sounds) (2020)

IEOM Conference 2020 First Prize Winner

https://hotpoprobot.com/2020/03/26/artemis-an-artificial-intelligent-robot-with-stereo-vision-can-hear-talk-learns-by-asking-why/


Our artificially intelligent robot has stereo vision to identify objects and their depths, can hear sounds and classify them as familiar, unfamiliar, or white noise, and can detect emotions. It uses local sensor data, logic, and probability to identify its environment and hold intelligent conversations.

It looks for human reinforcement to augment its logic, and it asks questions when there is no human reinforcement or when its deduced logic does not match reality.
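
The stereo-vision part can be illustrated with a minimal OpenCV sketch that estimates pixel disparities (and hence relative depth) from a left/right camera pair; the file names and calibration values are illustrative assumptions.

```python
# Sketch: disparity (and relative depth) from a stereo image pair with OpenCV.
import cv2

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)     # assumed file names
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype("float32") / 16.0   # StereoBM returns fixed-point values

# Depth is inversely proportional to disparity: depth = focal_length * baseline / disparity.
# focal_px, baseline_m = 700.0, 0.06                      # assumed calibration values
# depth_m = focal_px * baseline_m / disparity
```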


MY SCALE (to measure water quality) – North America Runners Up (2020)

Micro:Bit Do your Bit Challenge

https://hotpoprobot.com/2020/04/22/my-scale-for-measure-water-quality-is-north-america-runners-up-of-microbit-do-your-bit-challenge/


“MY SCALE” or (M)icro:bit for (Y)ouths: (S)alinity and (C)onductivity (A)nalysis in (L)ake (E)nvironment measures how easily current can pass through water. If the water is clean, then very little current passes through. It is made from things kids can find in their pencil cases.
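
A minimal micro:bit MicroPython sketch of the measurement idea is shown below: two probes in the water complete a circuit between the 3 V pin and pin 0, so the analog reading rises with conductivity (and hence salinity). The wiring and the two-second interval are assumptions.

```python
# Sketch: read water conductivity on a micro:bit via an analog pin.
from microbit import display, pin0, sleep

while True:
    reading = pin0.read_analog()     # 0-1023; higher = more conductive (saltier) water
    display.scroll(str(reading))
    sleep(2000)                       # wait two seconds between readings
```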


Dragonfly Bot (2020)

Elle Hacks

https://hotpoprobot.com/2020/02/05/elle-hacks-2020-merging-biology-neurons-technology/


Merging robotics with biology, neurons, and technology. The robot is inspired by the dragonfly, which uses just 16 neurons to steer its prey capture and has the highest prey-capture rate in the animal kingdom (97%). It is efficient because of the arrangement of these 16 neurons, which precisely control its wings based on the inputs they receive from its left and right eyes.
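
A toy sketch of the control idea follows: a handful of 'neurons' combine left- and right-eye prey signals into wing commands so the bot turns toward its target. The weights and steering rule are illustrative assumptions, not a biological model.

```python
# Sketch: steer toward the prey by beating one wing harder than the other.
def wing_commands(left_eye, right_eye, gain=0.8):
    """Return (left_wing, right_wing) thrust in [0, 1] given eye signals in [0, 1]."""
    turn = gain * (right_eye - left_eye)            # positive = prey is to the right
    left_wing = min(1.0, max(0.0, 0.5 + turn))       # beat the left wing harder to turn right
    right_wing = min(1.0, max(0.0, 0.5 - turn))
    return left_wing, right_wing

# print(wing_commands(left_eye=0.2, right_eye=0.9))  # -> turn right
```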


Make A Robot Smile (MARS) Bot (2019)

Toronto Science Fair 2019

https://hotpoprobot.com/2019/04/07/make-a-robot-smile-mars-bot-using-artificial-intelligence-at-toronto-science-fair-2019/

The robot starts with a sad face. As soon as it detects a happy face in front of it, it switches to a happy face and also raises its arm to wave the Canada flag! The robot has 2 goals: use artificial intelligence to detect facial emotions (happy or sad), and then make the robot move depending on the face you make at it.


Schools and NASA Aiding Climate Action by Kids (S.N.A.C.K) (2019)

NASA Space Apps Toronto 2019

https://hotpoprobot.com/2020/01/04/my-satellite-data-and-climate-change-project-wins-spaceapps-toronto-2019-and-is-a-global-nominee/

The project uses satellite imagery to rank schools based on their tree densities and maps them. Students can add and share pictures of school trees to make their schools greener.
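
One way such a tree-density ranking could work is sketched below, using the standard NDVI vegetation index over an image patch around each school; the band inputs and the 0.4 threshold are illustrative assumptions, not the project's exact pipeline.

```python
# Sketch: estimate tree cover around a school from red and near-infrared bands.
import numpy as np

def tree_density(red, nir, ndvi_threshold=0.4):
    """Fraction of pixels whose NDVI suggests tree cover."""
    ndvi = (nir - red) / (nir + red + 1e-9)          # NDVI in [-1, 1]
    return float((ndvi > ndvi_threshold).mean())

# Schools can then be ranked by this fraction:
# densities = {name: tree_density(r, n) for name, (r, n) in school_patches.items()}
# ranking = sorted(densities, key=densities.get, reverse=True)
```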


Facial Expression Recognizing Artificial Intelligence Bot (2019)

Ontario Science Centre, Tech Art Fair 

https://hotpoprobot.com/2019/02/24/machine-learning-algorithm-for-facial-emotion-recognizing-bot/


The Facial Expression Detection Robot is powered by artificial intelligence using TensorFlow. It scans for human faces around it and shows a happy face when the person in front of it is happy, and a sad face when the person in front of it is sad.

The robot uses a machine-learning algorithm we wrote in Python with TensorFlow and Keras to read the emotion on the person’s face: Happy, Sad, Angry, Surprised, or Neutral.
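
A minimal Keras sketch of this kind of classifier is shown below: a small CNN that maps a 48x48 grayscale face crop to one of the five emotions. The architecture and input size are assumptions, not necessarily the exact model used in the project.

```python
# Sketch: a small CNN for five-class facial emotion recognition.
from tensorflow import keras
from tensorflow.keras import layers

EMOTIONS = ["Happy", "Sad", "Angry", "Surprised", "Neutral"]

model = keras.Sequential([
    layers.Input(shape=(48, 48, 1)),                 # grayscale face crop
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(len(EMOTIONS), activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])

# After training on labelled face crops, the robot can pick the emotion via
# EMOTIONS[model.predict(face_batch)[0].argmax()] and react accordingly.
```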


Merging Music, Maths and Maryam Mirzakhani (2014 Fields Medal Winner) using Python Programming (2019)

https://hotpoprobot.com/2019/08/17/merging-music-maths-and-maryam-mirzakhani-2014-fields-medal-winner-using-python/

The work of Maryam Mirzakhani, the 2014 Fields Medal winner, on pentagon billiards excited us, made us think, and challenged us. We wanted to preserve the mathematics but express her work in our own way using maths, art, and music.


Deep Space Musical (2018)

NASA SpaceApps 2018 Toronto Winner

https://hotpoprobot.com/2018/10/23/deep-space-musical-using-images-from-hubble-space-telescope-wins-nasa-spaceapps-2018-toronto/

We transformed some of the iconic images of the Hubble Space Telescope into sound and created a musical of our Universe, from the birth of stars (the Pillars of Creation image), to the formation of galaxies of stars (the Hubble Ultra Deep Field image), to the violent end of stars (the Supernova 1987A image).


Smart Street Navigation Using Artificial Intelligence for City Kids (S.N.A.C.K) (2018)

Elevate – TD Bank Hackathon Top 5 Winner (among 86 teams)

https://hotpoprobot.com/2018/09/30/top-5-td-elevate-hackathon-winner-smart-street-navigation-using-artificial-intelligence-for-city-kid-s-n-a-c-k-so-kids-can-take-safest-walking-path-to-scho/

The goal of our S.N.A.C.K project (Smart Street Navigation Using Artificial Intelligence for City Kids) was to find the safest path for kids to walk to school and other places. It uses data from 5 Open Data Toronto databases and the Vision Zero Toronto document: city schools and parks, bicycling lanes, enhanced pedestrian safety traffic signals, school crossing guards, and pedestrian fatalities.

The artificial intelligence algorithm uses a feed-forward neural network created in Python to predict the risk index of every intersection in Toronto. The neural network’s 5 input nodes receive data about the crossing, and the output node provides the risk index.
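
A minimal sketch of such a feed-forward network is shown below: five inputs (one per safety-related feature around an intersection) and a single risk-index output. The hidden-layer size and the sigmoid output are assumptions for illustration.

```python
# Sketch: a feed-forward network mapping 5 intersection features to a risk index.
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Input(shape=(5,)),                  # e.g. schools/parks, bike lanes, signals, guards, fatalities
    layers.Dense(16, activation="relu"),
    layers.Dense(1, activation="sigmoid"),     # risk index between 0 (safer) and 1 (riskier)
])
model.compile(optimizer="adam", loss="mse")

# risk = model.predict(intersection_features)  # one row of 5 values per intersection
```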


Model Rockets with Live Telemetry Data and Videos (2018)

https://hotpoprobot.com/2018/07/11/rocket-launches-and-telemetry/


We love space and rockets. The ultimate thrill is making and launching your own rockets.

And fitting them with sensors and cameras to transmit live telemetry data and videos.

This is what we did!

More information


KiloNova Battle Bot (2018)

https://hotpoprobot.com/2018/06/08/thank-you-makerexpo-2018-the-maker-journey-continues/

Arushi with Lisa Winter, BattleBot/RobotWars

Arushi built her 1-pound battlebot, named KiloNova, using motors, electronic speed controllers, and a radio transmitter and receiver.

KiloNova made its debut at the Bot Brawl event held at MakerExpo 2018 in Kitchener.


Research Balloon Mission-4: Space Radiation, CubeSats and Commercial Off-The-Shelf Technologies (2018)

Cubes in Space

https://hotpoprobot.com/2018/05/23/research-balloon-mission-4-cubes-in-space-space-radiation-cubesats-and-commerical-off-the-shelf-technologies-cots/

I am excited that my project proposal to study the impact of space radiation on electronic components for CubeSats has been accepted to fly on a NASA zero-pressure, high-altitude scientific balloon.

My project is going to measure the impact of space radiation on the components needed to build low-cost CubeSats.


Space-REX: Predict Risk Index of Asteroid Collision using Artificial Intelligence (2018)

Mission Hack Toronto

https://hotpoprobot.com/2018/03/06/space-rex-predict-risk-index-of-asteroid-collision-using-a-i-and-display-the-data-musically-in-10-steps-and-40-hours/


We decided to create a project that would use artificial intelligence / neural networks to predict the “Palermo Technical Impact Hazard Scale” value for the collision of an asteroid with Earth. Our system would then convert the values of this scale into musical notes, which would be used to blink lights at the appropriate frequency on custom-designed hardware for displaying asteroid data.
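
The mapping step could be sketched as follows: a predicted Palermo-scale value is converted into a MIDI note and an LED blink rate. The value range and mapping here are illustrative assumptions, not the exact rules used in the project.

```python
# Sketch: convert a Palermo-scale value into a MIDI note and a blink frequency.
def palermo_to_note_and_blink(palermo, lo=-10.0, hi=0.0):
    """Map a Palermo-scale value to (MIDI note, blink frequency in Hz)."""
    frac = min(1.0, max(0.0, (palermo - lo) / (hi - lo)))   # 0 = negligible risk, 1 = high risk
    note = int(48 + frac * 36)       # C3 (48) up to C6 (84): riskier asteroids sound higher
    blink_hz = 0.5 + frac * 4.5      # riskier asteroids blink faster
    return note, blink_hz

# print(palermo_to_note_and_blink(-2.5))   # -> (75, 3.875)
```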

More information


Trappist-1 Model: Lights, Music, and Science (2018)

Ontario Science Centre, Toronto

https://hotpoprobot.com/2018/02/18/our-trappist-1-model-lights-music-and-science/


A light-and-sound model of the TRAPPIST-1 system in which the motion of the planets is controlled by the music they generate, and they all synchronize for a grand finale! It required making, learning musical notes, coding MIDI files, an Arduino, relays, LED lights, and a lot more: a very challenging project. So we had to do it!


Solar-X Device (2017)

Frontpage coverage in Toronto Metro Newspaper

To take solar measurements and help us understand more about Solar Eclipses and Climate Change.

More information 1

More information 2


Black Hole Robot (2017)

TAVES Electronics Show, Maker Festival 2017

Our 30-pound battlebot, a veteran of several robot battles!

More information


Yes I Can: Satellite Art (2017)

Winner of Canadian Space Agency SpaceApps Awards 2017

The Canada 150 logo and other mosaics made using data gathered by the Canadian satellite RADARSAT-2.

More information


Canadarm 2 Model (2017)

March for Science Toronto

A model of the Canadarm fitted with motorised jaws to pick up objects. It played a role in our TIFF Jump Cuts finalist movie, The Making of Canada.

Made its debut at the March for Science Toronto on 22 April 2017.


The Cosmic Dance (2017)

MakerFest 2017 Installation

The traditional Indian puppets are moved using servos. Their movements are determined by signals from space: we used a Geiger counter to detect alpha particles, beta particles, and gamma rays, which produced the signals.
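
A toy sketch of the mapping idea: radiation counts per second from the Geiger counter become a servo angle for a puppet. The count range and angle range are illustrative assumptions.

```python
# Sketch: more Geiger-counter detections in the last second swing the puppet further.
def counts_to_angle(counts_per_second, max_counts=50, max_angle=180):
    """Map a count rate onto a servo angle in degrees."""
    frac = min(counts_per_second, max_counts) / max_counts
    return int(frac * max_angle)

# print(counts_to_angle(12))   # -> 43 degrees
```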


Z-Bot (2016)

Inspired by our Galactic Bot’s popularity, we created a more advanced version that could also study the spectrum of the Sun and display it on a computer.


Gesture Sensing Helmet (2015)

Fashion Zone 2015 Winner

A gesture sensor embedded in a bicycle helmet allows cyclists to give turn signals using gestures!


Fix the Six (2016)

Climathon 2016 First Prize and Climate Leadership Award 

To reduce climate emissions in the transport sector by 5%.


Drop the Drought (2017)

Winner NASA Space Apps 2017 and Global Nominee

To predict droughts along the Kenya-Uganda border using NASA satellite data from Landsat and Terra MODIS.


M.A.R.S (Maze solving Algorithm for Rovers and Space Applications) Path-Finding Rover (2016)

The rover has a camera and uses a maze-solving algorithm (written in MATLAB) to find a path around obstacles, which is useful for autonomous rovers.
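
The project itself used MATLAB, but the idea can be illustrated with a short Python sketch of a standard breadth-first maze solver over a grid where 0 marks a free cell and 1 marks an obstacle.

```python
# Sketch: breadth-first search for the shortest obstacle-free path on a grid.
from collections import deque

def solve_maze(grid, start, goal):
    """Return the shortest list of (row, col) cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    queue, came_from = deque([start]), {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:          # walk back to the start
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 and (nr, nc) not in came_from:
                came_from[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

# maze = [[0, 0, 1], [1, 0, 1], [1, 0, 0]]
# print(solve_maze(maze, (0, 0), (2, 2)))   # [(0, 0), (0, 1), (1, 1), (2, 1), (2, 2)]
```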


3-D Printed Prosthetic Hand (2015)

Featured in the Toronto Star

First-ever “Handathon” to assemble prosthetic hands that will be sent to children who need them.


Galactic Bot (2015)

Hardware Hackathon 2015 winner

Made within 24 hours to learn more about temperature, color, age and composition of stars. It can recreate colors of different stars using RGB LEDs and is open source!


Roll-it, Roll-it Robot (2016)

Toronto Science Fair

A robot that rolls on the ground. It uses a servo and gravity to propel itself forward.


Artificial Plant (2016)

Toronto Science Fair

The robotic plant grows based on sensor inputs measuring gravity, moisture, and light!


Maze-solving Algorithm Rover (M.A.R.S) (2016)

NASA SpaceApps 2016

The rover avoids obstacles using the maze-solving algorithm. 


Pengy: The Drawing Bot (2015)

Maker Fest Toronto

The robot uses two servos and Arduino to make drawings based on pre-programmed coordinates.

It makes use of inverse kinematics to produce the movements.
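
The inverse-kinematics step can be sketched as follows for a two-link arm driven by two servos: given a target point (x, y), compute the two joint angles. The link lengths are illustrative assumptions.

```python
# Sketch: two-link inverse kinematics for a drawing arm.
import math

def two_link_ik(x, y, l1=80.0, l2=80.0):
    """Return (shoulder, elbow) angles in degrees for a target point in mm."""
    d2 = x * x + y * y
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)   # law of cosines
    if abs(cos_elbow) > 1.0:
        raise ValueError("Target point is out of reach")
    elbow = math.acos(cos_elbow)
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow), l1 + l2 * math.cos(elbow))
    return math.degrees(shoulder), math.degrees(elbow)

# print(two_link_ik(100.0, 60.0))   # angles to send to the two servos
```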


Apollo 11/Saturn V Working Model (2015)

NASA Space Apps

Using Arduinos, sensors, transceivers, motors, and NASA audio files, we made a 30-LED sequence panel, Launch Pad 39A, a countdown clock, a Saturn V rocket, and electric bolts. Actions were automated to the countdown clock: (1) retraction of the arms, (2) the rocket switching to internal power, (3) firing of the bolts, (4) the rocket blasting off using mechanical force, and (5) the rocket transmitting live telemetry data (pitch, roll, yaw, acceleration, altitude) to a display screen over wireless.

The project was inspired by our family’s visit to the Kennedy Space Centre’s Apollo 11 / Saturn V building in 2015 where we saw the Control Room for the launch along with the giant Sequence Panel, VAB, Launch Pad and the Saturn V rocket itself.


Toronto Playground (2014)

Maker Fest Toronto

One of our very first projects, simulating children playing in Toronto parks. The park we made was Stanley Park, with a see-saw, swings, a merry-go-round, and a water fountain switching on sequentially using an Arduino.


Chris Hows-It-Feel Bot (2014)

Maker Fest Toronto

The robot moved around singing songs from Star Wars. It could move its head and arms to wave the Canadian flag. It could also record and display temperature and humidity data.