The analysis showed that the lockdown affected seismic vibrations in almost every city I studied. In every city except Ottawa, seismic vibrations decreased by 14% to 44%, with the largest drop in Yellowknife in the Northwest Territories. In the three densely populated cities with over one million residents (Toronto, Montreal, and Calgary), seismic vibrations dropped by over 30%. In Ottawa, the capital of Canada, seismic vibrations actually increased by 8%.
The Masked Scales: Measuring and Sonifying the Impact of COVID19 into a Lockdown Musical (2020)
We built an instrument using an Arduino, sensors, and a camera to measure changes in street noise, vehicular traffic, emissions, and light intensity during the COVID-19 lockdown. We used machine learning (to count vehicles from a video feed) and Python to assess changes happening during and after the lockdown. We converted our analysis into a musical using four instruments, each representing a change in a different variable: marimba (light), vibraphone (emissions), piano (street noise), and flute (COVID-19 infection cases in Toronto). The tempo of the music was determined by changes in city night lights (using NASA VIIRS data) and vehicular traffic counts before and after the lockdown.
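For the curious, here is a minimal Python sketch of the kind of mapping this describes: a sensor reading becomes a note on a scale, and overall city activity sets the tempo. The scale, ranges, and tempo rule shown are illustrative assumptions, not our exact mapping.

```python
# Pentatonic-flavoured scale starting at middle C, as MIDI note numbers
# (an assumed scale for illustration).
SCALE = [60, 62, 64, 67, 69, 72]

def value_to_note(value, lo, hi):
    """Map a sensor reading in [lo, hi] onto a note of the scale."""
    frac = min(max((value - lo) / (hi - lo), 0.0), 1.0)
    return SCALE[round(frac * (len(SCALE) - 1))]

def tempo_bpm(night_light_change, traffic_change):
    """Slow the music down as city lights and traffic drop (assumed rule)."""
    activity = (night_light_change + traffic_change) / 2  # e.g. 0.7 means a 30% drop
    return 60 + 60 * activity  # 60-120 BPM

# Example: a street-noise reading of 42 dB on a 30-80 dB range -> a piano note.
print(value_to_note(42, 30, 80), tempo_bpm(0.7, 0.65))
```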
Applying Machine Learning to Exoplanet Data from Space Telescopes (2020)
3 Ribbons Winner, 2020 Canada Online STEM Fair; Gold Medal, 2020 IRIC North American Science Fair
Applying machine learning to exoplanetary data may help remove the noise of star-spots from data on transiting exoplanets’ atmospheres received by space telescopes. I created a hybrid machine learning model using Long Short-Term Memory (LSTM), a form of Recurrent Neural Network (RNN), to reduce this noise. My model was able to accurately predict the exoplanet-to-star radius ratio in 55 wavelengths with a mean squared error of 0.001.
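A minimal Keras sketch of this kind of LSTM regressor follows. The layer sizes, light-curve shape, and placeholder data are assumptions; only the 55-wavelength output and the mean-squared-error loss come from the project.

```python
import numpy as np
import tensorflow as tf

timesteps, features = 100, 1  # assumed shape of one transit light curve
model = tf.keras.Sequential([
    tf.keras.layers.LSTM(64, input_shape=(timesteps, features)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(55),  # radius ratio predicted in 55 wavelengths
])
model.compile(optimizer="adam", loss="mse")  # mean squared error, as reported

# Train on (noisy light curve -> radius-ratio spectrum) pairs.
x = np.random.rand(32, timesteps, features).astype("float32")  # placeholder data
y = np.random.rand(32, 55).astype("float32")
model.fit(x, y, epochs=1, verbose=0)
```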
ARTEMIS (Artificially intelligent Real-time Training by Environment, Mapping, Immersion, and Sounds) (2020)
Our artificially intelligent robot has stereo vision to identify objects and their depths, can hear sounds and classify them as familiar, unfamiliar, or white noise, and can detect emotions. It uses local sensor data, logic, and probability to identify its environment and hold intelligent conversations.
It looks for human reinforcement to augment its logic, and it asks questions when there is no human reinforcement or when its deduced logic does not match reality.
MY SCALE (to measure water quality) – North America Runners-Up (2020)
“MY SCALE”, or (M)icro:bit for (Y)ouths: (S)alinity and (C)onductivity (A)nalysis in (L)ake (E)nvironment, measures how easily current can pass through water. If the water is clean, very little current passes through; dissolved salts increase the conductivity. It is made of things found by kids in their pencil cases.
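A minimal MicroPython sketch of the idea on a BBC micro:bit follows. The pin choice and threshold are assumptions: current flows from the 3V pin through the water sample into pin0, and the analog reading rises as the water conducts more.

```python
from microbit import display, pin0, sleep, Image

while True:
    reading = pin0.read_analog()   # 0-1023
    if reading < 100:              # little current: likely clean water
        display.show(Image.HAPPY)
    else:                          # conductive (salty) water
        display.show(Image.SAD)
    sleep(500)
```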
Merging robotics with biology, neurons, and technology. The robot is inspired by the dragonfly, which has the highest prey-capture rate in the animal kingdom, about 97%. It is so efficient because of the arrangement of 16 target-selective neurons that precisely control its wings based on the inputs it receives from its left and right eyes.
The robot starts with a sad face. As soon as it detects a happy face in front of it, it switches to a happy face and raises its arm to wave the Canadian flag! The robot has two goals: use artificial intelligence to detect facial emotions (happy or sad), and then move depending on the face you make at it.
Schools and NASA Aiding Climate Action by Kids (S.N.A.C.K) (2019)
The project uses satellite imagery to rank schools based on their tree densities and maps them. Students can add and share pictures of school trees to make their schools greener.
The Facial Expression Detection Robot is powered by artificial intelligence using TensorFlow. It scans for human faces around it and shows a happy face when the person in front of it is happy, and a sad face when the person is sad.
The robot uses a machine-learning algorithm we wrote in Python with TensorFlow and Keras to read the emotion on the person’s face: Happy, Sad, Angry, Surprised, or Neutral.
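A minimal Keras sketch of a five-class emotion classifier like the one described follows; the layer sizes and the 48x48 grayscale input are assumptions, not our actual network.

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(48, 48, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(5, activation="softmax"),  # Happy/Sad/Angry/Surprised/Neutral
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```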
Merging Music, Maths and Maryam Mirzakhani (2014 Fields Medal Winner) using Python Programming (2019)
The work of Maryam Mirzakhani, the 2014 Fields Medal winner, on pentagon billiards excited us, made us think, and challenged us. We wanted to preserve the mathematics but express Maryam’s work in our own way using maths, arts, and music.
We transformed some of the Hubble Space Telescope’s iconic images into sound and created a musical of our Universe, from the birth of stars (using the Pillars of Creation image), to the formation of galaxies of stars (using the Hubble Ultra Deep Field image), to the violent end of stars (using the Supernova 1987A image).
Smart Street Navigation Using Artificial Intelligence for City Kids (S.N.A.C.K) (2018)
Elevate – TD Bank Hackathon Top 5 Winner (among 86 teams)
The goal of our S.N.A.C.K project (Smart Street Navigation Using Artificial Intelligence for City Kids) was to find the safest path for kids to walk to school and other places. It uses data from five databases from Open Data Toronto and the Vision Zero Toronto document: city schools and parks, bicycle lanes, enhanced pedestrian-safety traffic signals, school crossing guards, and pedestrian fatalities.
The artificial intelligence algorithm uses a feed-forward neural network created in Python to predict the risk index of every intersection in Toronto. The network’s five input nodes receive data about the crossing, and the output node provides the risk index.
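A minimal Keras sketch of such a network follows, with the 5 input nodes and single risk-index output described above; the hidden-layer size and sigmoid output range are assumptions.

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(5,)),  # 5 crossing features
    tf.keras.layers.Dense(1, activation="sigmoid"),                  # risk index in [0, 1]
])
model.compile(optimizer="adam", loss="mse")
```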
Model Rockets with Live Telemetry Data and Videos (2018)
I am excited that my project proposal to study the impact of space radiation on electronic components for CubeSats has been accepted to fly on a NASA zero-pressure, high-altitude scientific balloon. The project will measure how space radiation affects the components needed to build low-cost CubeSats.
Space-REX: Predict Risk Index of Asteroid Collision using Artificial Intelligence (2018)
We decided to create a project that uses artificial intelligence / neural networks to predict the “Palermo Technical Impact Hazard Scale” for the collision of an asteroid with Earth. Our system then converts the values of this scale into musical notes, which are used to blink lights at the corresponding frequencies on custom-designed hardware for displaying asteroid data.
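A minimal Python sketch of the value-to-note-to-frequency step follows. The Palermo scale range and note mapping are assumptions; the MIDI-note-to-frequency formula (440 * 2**((n - 69) / 12) Hz) is standard.

```python
def palermo_to_midi(ps, lo=-10.0, hi=0.0):
    """Map a Palermo scale value (typically negative) onto one octave above middle C."""
    frac = min(max((ps - lo) / (hi - lo), 0.0), 1.0)
    return 60 + round(frac * 12)

def midi_to_hz(note):
    """Standard equal-temperament conversion from MIDI note to frequency."""
    return 440.0 * 2 ** ((note - 69) / 12)

note = palermo_to_midi(-2.5)
print(note, round(midi_to_hz(note), 1))  # the note and its blink frequency in Hz
```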
Trappist-1’s Musical System: a light-and-sound model where the motion of the planets is controlled by the music they generate, and they all synchronize for a grand finale! It required making and learning musical notes, coding MIDI files, Arduino, relays, LED lights, and a lot more: a very challenging project. So we had to do it!
Solar-X Device (2017)
Frontpage coverage in Toronto Metro Newspaper
A device to take solar measurements and help us understand more about solar eclipses and climate change.
A model of the Canadarm fitted with motorised jaws to pick up objects. It played a role in our TIFF Jump Cuts finalist movie, The Making of Canada.
Made its debut at the March for Science Toronto on 22 April 2017.
The Cosmic Dance (2017)
MakerFest 2017 Installation
Traditional Indian puppets are moved using servos. Their movements are determined by signals from space: a Geiger counter detecting alpha particles, beta particles, and gamma rays produces the signals.
Z-Bot (2016)
Inspired by our Galactic Bot’s popularity, we created a more advanced version which could also study the spectrum of the Sun and project it on a computer.
Gesture Sensing Helmet (2015)
Fashion Zone 2015 Winner
A gesture sensor embedded in a bicycle helmet allows cyclists to give turn signals using gestures!
Fix the Six (2016)
Climathon 2016 First Prize and Climate Leadership Award
To reduce climate emissions in the transport sector by 5%.
Drop the Drought (2017)
NASA Space Apps 2017 Winner and Global Nominee
To predict droughts on the Kenya-Uganda border using NASA satellite data from Landsat and Terra MODIS.
M.A.R.S (Maze solving Algorithm for Rovers and Space Applications) Path-Finding Rover (2016)
The rover has a camera and uses a maze-solving algorithm (written in MATLAB) to find a path around obstacles. Useful for autonomous rovers.
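Here is a Python sketch of a comparable breadth-first maze search (our rover used MATLAB; the grid, start, and goal below are illustrative):

```python
from collections import deque

def solve_maze(grid, start, goal):
    """Return the shortest path of (row, col) cells, treating 1s as obstacles."""
    rows, cols = len(grid), len(grid[0])
    parent = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []                      # walk parents back to the start
            while cell is not None:
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 \
                    and (nr, nc) not in parent:
                parent[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # no path around the obstacles

print(solve_maze([[0, 1, 0],
                  [0, 1, 0],
                  [0, 0, 0]], (0, 0), (0, 2)))
```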
3-D Printed Prosthetic Hand (2015)
Featured in the Toronto Star
The first-ever “Handathon” to assemble prosthetic hands to be sent to children who need them.
Galactic Bot (2015)
Hardware Hackathon 2015 winner
Made within 24 hours to learn more about the temperature, color, age, and composition of stars. It can recreate the colors of different stars using RGB LEDs, and it is open source!
Roll-it, Roll-it Robot (2016)
Toronto Science Fair
A robot that rolls on the ground, using a servo and gravity to propel itself forward.
Artificial Plant (2016)
Toronto Science Fair
The robotic plant grows based on inputs from sensors that measure gravity, moisture, and light!
Maze-solving Algorithm Rover (M.A.R.S)
NASA SpaceApps 2016
The rover avoids obstacles using a maze-solving algorithm.
Pengy: The Drawing Bot (2015)
Maker Fest Toronto
The robot uses two servos and an Arduino to make drawings from pre-programmed coordinates.
It uses inverse kinematics to produce the movements.
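A minimal Python sketch of two-link inverse kinematics, the technique behind those movements, follows; the link lengths and target point are illustrative assumptions.

```python
import math

L1, L2 = 5.0, 5.0  # assumed lengths of the two arm segments, in cm

def ik_two_link(x, y):
    """Return (shoulder, elbow) angles in degrees that put the pen at (x, y)."""
    d2 = x * x + y * y
    # Law of cosines gives the elbow angle from the target distance.
    cos_elbow = (d2 - L1 ** 2 - L2 ** 2) / (2 * L1 * L2)
    if abs(cos_elbow) > 1:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)
    shoulder = math.atan2(y, x) - math.atan2(L2 * math.sin(elbow),
                                             L1 + L2 * math.cos(elbow))
    return math.degrees(shoulder), math.degrees(elbow)

print(ik_two_link(6.0, 4.0))  # the two angles to send to the servos
```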
Apollo 11/Saturn V Working Model (2015)
NASA Space Apps
We built it using Arduinos, sensors, transceivers, motors, and NASA audio files, and made a 30-LED sequence panel, Launch Pad 39A, a countdown clock, the Saturn V rocket, and electric bolts. Actions were synchronized to the countdown clock: (1) retraction of the arms, (2) rocket switching to internal power, (3) firing of the bolts, (4) rocket launching using mechanical force, and (5) rocket transmitting live telemetry data (pitch, roll, yaw, acceleration, altitude) to a display screen over wireless.
The project was inspired by our family’s visit to the Kennedy Space Centre’s Apollo 11 / Saturn V building in 2015, where we saw the Control Room for the launch along with the giant Sequence Panel, the VAB, the Launch Pad, and the Saturn V rocket itself.
Toronto Playground (2014)
Maker Fest Toronto
One of our very first projects, simulating children playing in Toronto parks. We modelled Stanley Park, with a see-saw, swings, a merry-go-round, and a water fountain switching on sequentially using an Arduino.
Chris Hows-It-Feel Bot (2014)
Maker Fest Toronto
The robot moved around singing songs from Star Wars. It could move its head and arms to wave the Canadian flag, and it could also record and display temperature and humidity data.
Winners
Best of the Fair Award and Gold Medal, Canada Wide Science Fair 2022; RISE 100 Global Winner; Silver Medal, International Science and Engineering Fair 2022; Gold Medal, Canada Wide Science Fair 2021; NASA SpaceApps Global 2020; Gold Medalist, IRIC North American Science Fair 2020; BMT Global Home STEM Challenge 2020; Micro:bit Challenge North America Runners-Up 2020; NASA SpaceApps Toronto 2019, 2018, 2017, 2014; Imagining the Skies Award 2019; Jesse Ketchum Astronomy Award 2018; Hon. Mention, 2019 NASA Planetary Defense Conference; Emerald Code Grand Prize 2018; Canadian Space Apps 2017.