The Project Proposal
- Objectives :
The future calls for a safer, more efficient, and reliable habitable environment for everyone across the globe - an attack-, accident-, and disaster-proof infrastructure with zero casualties in such events. But if casualties do occur, we need the technology and resources to respond. SARDA is a deployable robotic unit intended to help first responders in such crucial situations carry out search and rescue operations in dangerous environments.
When deployed into a disaster site, it maps and renders a dynamic, color-coded 3D model of its immediate surroundings along with data like temperature and humidity. This not only helps the rescue team navigate the terrain better, but also informs us in case of fire and smoke, along with their exact location.
This digital twin can also be overridden with manual control, providing us with an efficient two-way communication system.
The deployable unit :
The visual display unit illustration ( artwork ) :
We, along with SARDA, aim to change the way these operations are approached, in order to create a healthier and safer working environment for rescue personnel and shape a better, more secure future ahead.
- Result :
The system consists of three different units that work together in harmony. The deployable locomotive unit (DLU), which is automated, can path-find and traverse the terrain while scanning and mapping its surroundings. The captured data is sent to a receiver and controller unit (RCU), which processes it further and passes it to the attached computer running the virtual assistant. The assistant renders the geometry and displays all the other data intelligently to the user. The dynamic virtual assistant contains the digital twin of the deployable unit, which can therefore be overridden and controlled directly, making the RCU capable of two-way communication. The separate receiver unit also gives the system versatility: it can be used with any device, be it Android, Windows or Linux, irrespective of the hardware, all working alongside each other.
Screenshot of the working visual unit in Unreal Engine :
Prototype 1 : March 2022

Prototype 2 : September 2022
- Conclusion :
SARDA's dynamic nature and versatility allow varied customization and enable building around its basic architecture to tailor its functionality to the user's needs. This opens a doorway to endless possibilities: it can be used not only to help civilians, but also for military applications and during firefighting tasks.
For example, if a site is damaged by an earthquake or other causes, the deployable unit can take a snapshot, record a video, or simply map the site.
A variety of other sub-units can be developed and used with the deployable unit to measure orientation, temperature, humidity, air quality and much more.
The cost of developing such a system is very low, so it can easily be provided for home security or customized as a military framework for surveillance.
The poster
Presenting the SARDA bot at the National Exhibition, September 2022
Sayanti Chatterjee and I presenting the SARDA prototype at the 75th National Exhibition at the Science Fair, Salt Lake, in September 2022.
Here are some videos
Here is our development blog !
Death, war and destruction have plagued humanity for millions of years.
It was time to do something.
Or at least, that's what the excuse was for making this project !
We wanted to make something big, to showcase what we could actually build. After all the previous "mini" projects, I personally wanted something mobile, something complex and revolutionary - something that would bring all my past project experience and knowledge into one place and give birth to something ultimate. It would not only be the best thing to look at and present to people, but also a technical feat for us.
So, yah. We decided to make an automated mobile robot that can pathfind and 3D-scan entire areas, and can also give us information like temperature, humidity and smoke levels. All of this would be displayed eloquently in a piece of software that we would develop.
This would not only aid the rescue team in navigating the terrain better, but also inform us in case of fire and smoke, along with its exact location. The digital twin could also be overridden with manual control, providing us with an efficient two-way communication system.
We, along with SARDA, aim to change the way these rescue operations are approached, in order to create a healthier and safer working environment for rescue personnel and shape a better, more secure future ahead.
Now, this whole system consists of three different units that work together in harmony. The deployable locomotive unit (DLU), which is automated, can path-find and traverse the terrain while scanning and mapping its surroundings. The captured data is sent to a receiver and controller unit (RCU), which processes it further and passes it to the attached computer running the virtual assistant. This renders the geometry and displays all the other data intelligently to the user. The dynamic virtual assistant contains the digital twin of the deployable unit, which can therefore be overridden and controlled directly, making the RCU capable of two-way communication. The separate receiver unit also gives the system versatility: it can be used with any device, be it Android, Windows or Linux, irrespective of the hardware, all working alongside each other.
Arduino Nano
Magnetometer
Motor Driver
NRF Module On-Board
NRF Module on the Ground Module
Gas / Smoke Sensor
Temperature Sensor
Ultrasonic Sensor
With all these in mind, we started working on the project.
Day 1 • 6th April, 2022
We bought all the necessary materials and stuff needed to build the project. And hey, we tried out an Arduino Nano, making it the first time ever using one of these bad boys. It wasn't everything we needed, but it surely gave us a good starting point for the build.
We would make everything from scratch - even the body of the robot. I would use my classic lamination fibre and that good old knife, as I did in my previous projects, to make the chassis of the robot.
And of course, I had my beloved Sayanti working with me in the Neotia University robotics lab.
We started by marking and prepping the lamination fibres in order to cut them into shape !
Now, before we could even get started, we realised that the four custom DC motor armatures we had bought - which would house the wheels of the car and drive each of them individually - didn't have wires or jumper cables attached to them. Obvious as it was, it meant we had to solder on our own wires and prep everything properly before we could even test them out.
This was problematic for us because, uhh, we must confess it was the first time we had ever picked up a soldering iron. Kids with big dreams, huh - you know what we mean. There's a first time for everything !
Still, this was a great opportunity to learn some good ol' soldering. We stripped and prepared the jumper cables we had in order to solder them to the motors, and got out the soldering equipment, including the paste and everything. It took a few blisters, some YouTube videos and a series of failures to get things up and running. And right when we thought we were done, the wires would come off. It took a lot of fiddling around to get them into a stable position. We even used hot glue on top of the solder joints to secure the wires in place and prevent short circuits.
We were using the L293D motor driver, which was also new to us. So, we had to look up datasheets and articles on the web to learn about all of its features and how we could use it to its fullest potential.
Once we got it to work, we were finally able to attach the DC motors we had prepped by soldering and test whether they worked.
But before we could move forward, another problem befell us: we couldn't find any datasheet or information on the operating voltage of the motors. That wasn't a dead end, though, because we decided to go with 7.4 volts instead - two of our Li-ion batteries in a 3-slot battery holder. We just needed to bridge two of the terminals together in order to skip one slot and make it work with two batteries instead of three.
Let's just say things didn't go well - occasional disasters here and there due to short circuits and reversed polarity. But yah, we brought in a multimeter and eventually made it work.
With that done, we hooked up the DC motors and they worked perfectly, so we moved on towards spinning wheels.
In order not to mess things up, we made a rough 3D model of the whole robot - here, the locomotive unit - inside 3ds Max with real-life measurements, to figure out the optimal size of the different parts. And if we had to add something, we'd just look up the dimensions digitally and check whether it would fit properly.
So, yah, after we'd decided on the structure and measurements of the car, we used our good old carbon knife to cut the fibre into the needed pieces.
Now, we were past the amateur mistakes. We knew what to do, so the next step was to attach the different circuits and equipment to the board before taking care of the motors.
We had to take into consideration the battery holder, the motor driver and the other equipment that would go on the board. So, we measured all the different components and decided how much space we needed for each. We used the 3ds Max model as a reference and drew the lines onto the board where we needed to drill holes for the screws, instead of just sticking things on with glue, which might come off with harsh usage !
So, yah. We spent a lot of time on the measurement aspect of the build and marked the fibres with the carbon knife. We overlaid the different modules onto the fibre and marked the screw holes by deeply poking them with a pointy instrument. And yes, we used a marker as well.
Cutting the fibres by hand was an extremely labour-intensive job, since we had to put massive pressure onto the plate with the carbon knife and keep making incisions until it came apart. Afterwards we needed to file the edges to make them smooth.
This whole process had to be repeated again and again until we could carve the required pieces out of the whole fibre board.
Day 2 • April 8th, 2022
Today, we took up the drill machine. We had never used one before, so it was a first-time kind of thing, but we were willing to learn. So, after the main chassis piece was ready and cut out properly, we made sure the markings for the screw holes were in the right positions and brought in the lab's drill machine. A smaller drill bit would have been better, but the one we had would also get the work done. So yah, with a little trial and error we got it working.
It was such a fragile piece of material, so we had to be careful while making the holes - and careful of the table underneath it. We don't want to punch tunnels into the lab table, do we now ?
After the holes were done, we could easily attach the battery holder and the driver module.
The next step was to attach the DC motors onto the board, and it was tricky since there was no way to screw them on ! So, we took a file and roughed up some sections of the lamination fibre - I'm talking about giving it a rough texture. This would let us hot-glue the motors in place; it was so smooth before that the glue wouldn't hold otherwise.
Now, that alone didn't give us much assurance, because with rough usage the motors might still come off - not to mention the heat they would generate. So we took some zip ties and tied the whole thing in place on top of the hot glue, to make it extra secure !
It took some time to get everything in place. And after fiddling around for an hour or so, it was finally ready and good to go !
Looking at something like this come together is so satisfying. And it felt really good - we made a car from scratch ! Even the chassis.
Day 3 • 9th April, 2022
So, our car had its form at this point. But it had no engine ! So, we needed to put one in.
Now, as decided earlier, we wanted to power it with an Arduino Nano due to space and weight constraints. So, I brought that in !
Meanwhile, we took a bold decision and decided to build everything on mini breadboards with prototyping cables. This would let us easily change things if something went wrong, and it would also be convenient for upgrades. Everything aside, it actually was a prototype, so why not use prototyping boards !
So, yah. I made all the connections between the motor driver and the Arduino and wanted to try it out ! That meant it was time to code.
Now, we knew the code was going to be huge !! All these different systems interacting with each other and working together. Not to mention Sayanti and I had to share code with each other, so we needed a unified workspace.
So, we turned to the infamous GitHub, where we set up a shared repository and got to work.
We had already interfaced the motor driver back in the lab to test it, so we just needed a fresh, optimised version of the code rewritten to work with the full setup. That's where GitHub came in: Sayanti wrote the code and I brought it in and tweaked it a bit to match the scenario.
This little bit of code was for testing whether the wheels worked and the motor driver functioned as intended - and it did ! We were really happy about it.
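Just to give a rough idea, the wheel test boiled down to something like the sketch below - one L293D channel driven from the Nano. The pin numbers here are placeholders rather than our exact wiring, so treat it as an illustration.

```cpp
// Minimal wheel-spin test for one L293D channel (pin numbers are placeholders).
const int ENABLE_1 = 9;  // EN1 pin of the L293D (PWM-capable pin on the Nano)
const int INPUT_1  = 7;  // IN1
const int INPUT_2  = 8;  // IN2

void setup() {
  pinMode(ENABLE_1, OUTPUT);
  pinMode(INPUT_1, OUTPUT);
  pinMode(INPUT_2, OUTPUT);
}

void loop() {
  // Spin forward at roughly half speed for two seconds...
  digitalWrite(INPUT_1, HIGH);
  digitalWrite(INPUT_2, LOW);
  analogWrite(ENABLE_1, 128);
  delay(2000);

  // ...then reverse for two seconds...
  digitalWrite(INPUT_1, LOW);
  digitalWrite(INPUT_2, HIGH);
  delay(2000);

  // ...then stop for a second before repeating.
  analogWrite(ENABLE_1, 0);
  delay(1000);
}
```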
Day 4 • 18th April, 2022
But now, it was time to move an inch closer towards the final structure of the robot.
Here's the plan ! Let's have two breadboards, one on either side: one housing the Arduino Nano, and the other housing components like the magnetometer, the DHT, the smoke sensor, etc.
We'll bring power back from the motor driver and distribute it here at the front. We'll have lots of cables to do that !
So, we got these mini breadboards - which are absolutely my favourites, so cute - and attached them in their designated positions on both sides.
We had already interfaced the Arduino with the motor driver, so we just needed to fill up the other breadboard.
I started by adding a little buzzer and connecting it to the Arduino after working out the power supply. Next in line were the temperature sensor and the magnetometer.
Day 5 • 22 April, 2022
It was college fest !
But that morning, we were busy interfacing and testing the magnetometer in the robotics lab.
And the reason we had to use a magnetometer in this project is that, as you know, we planned to have a digital twin of this scanner unit in our mapping software.
Now, a digital twin is a digital copy or instance of a real-world object: wherever it moves, rotates or scans, we'll be able to see it updating in real time in our visual display unit.
So, in order to track the orientation and the state of our scanner unit, we had to incorporate the magnetometer!
Now, getting the code to work on something like this was really tough. So, it needed some extra effort before it finally worked as intended.
And now, finally, we could track the orientation and see it updating on the screen. That was so satisfying. But little did we know, getting it to work with the whole setup would be a whole different story.
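For anyone curious, a bare-bones heading read looks something like this. It assumes an HMC5883L-style magnetometer on I2C and the common Adafruit library, which may not match our exact sensor or code - it's just a sketch of the idea.

```cpp
// Illustrative heading read from an HMC5883L-style magnetometer over I2C.
#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_HMC5883_U.h>

Adafruit_HMC5883_Unified mag = Adafruit_HMC5883_Unified(12345);

void setup() {
  Serial.begin(9600);
  if (!mag.begin()) {
    Serial.println("No magnetometer detected - check the wiring!");
    while (true) {}
  }
}

void loop() {
  sensors_event_t event;
  mag.getEvent(&event);

  // Heading from the horizontal field components (the sensor has to stay level).
  float heading = atan2(event.magnetic.y, event.magnetic.x);
  if (heading < 0) heading += 2 * PI;

  Serial.print("Heading: ");
  Serial.println(heading * 180 / PI);  // degrees
  delay(250);
}
```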
Day 6 • 23 April, 2022
We already had the code for the temperature sensor up and running, so the only thing left was to interface and connect the DHT on board. So, I did exactly that and tested it to make sure it was working fine !
Now it was time to interface the magnetometer we had written the code for back in the lab. Since we already knew which connections worked best and already had clean code thanks to our rigorous testing, we could just fuse it with the current state of the build and move on.
The magnetometer lay horizontally on the breadboard when we put it in, which is actually good ! We just had to make sure it stayed exactly level with the ground to avoid any miscalculations.
Day 7 • 24th April, 2022
As mentioned earlier, our deployable unit - the scanner unit with wheels - would scan the area, record the data and send it wirelessly to the receiver module, which would sit still and act as a receiving and decoding station for the visual display unit, i.e. the software !
Thus, we needed to get the wireless aspect of the project sorted out !
To enable wireless communication, we decided to use NRF radio communication modules, which are really tiny and use SPI.
So, without further ado, we got to work and quickly interfaced the module with the onboard Nano using long wires, since I had plans to attach it somewhere around the middle of the robot !
After writing the code, we cleaned and optimised it a bit and used my Arduino Uno to connect the second NRF module, so the two could talk to each other and we could test whether it even worked !
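As a rough picture of that test, here's what the transmit side looks like with the popular RF24 library. The CE/CSN pins and the pipe address are placeholders; the Uno on the other end ran the mirror-image receive code.

```cpp
// Transmit-side test with the RF24 library (CE/CSN pins and address are placeholders).
#include <SPI.h>
#include <RF24.h>

RF24 radio(9, 10);                // CE, CSN
const byte address[6] = "NODE1";  // shared pipe address

void setup() {
  radio.begin();
  radio.openWritingPipe(address);
  radio.setPALevel(RF24_PA_LOW);  // low power keeps the module happier on a weak supply
  radio.stopListening();          // this node only transmits in the test
}

void loop() {
  const char msg[] = "ping from SARDA";
  radio.write(&msg, sizeof(msg)); // the Uno on the other side reads this and prints it
  delay(1000);
}
```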
The connection was a bit unstable and things acted unreliably, and even after fiddling around with the datasheets it wasn't much clearer to us.
And... I may have fried the circuits... I totally forgot it's a little 3.3 V module.
Anyway, in order to take a break from all this, I continued work on a different aspect of the project: the motor speed and duration calibration.
You see, I needed the unit to turn approximately 90° when the turn command is given. And in order to ensure that precise control, it needed to be calibrated !
Luckily, we have the on-board magnetometer, which can tell us exactly how much the vehicle actually turns !
So, we took numerous readings, multiple times. Through an internal calibration routine we wrote, the unit would be commanded to rotate towards a specific direction - the Arduino repeatedly sends a specific signal to the motor driver - and we would record the actual rotation the vehicle underwent.
These rotational movements happened continuously, and the recorded data - the expected movement and the actual movement - was sent back through serial communication over the USB cable.
Through different experiments, we noticed that commanding a different rotation angle each iteration, decreasing uniformly, gave the best spread of variance needed to formulate the equation we wanted. So the routine tries to rotate a different amount every iteration !
These readings were taken over and over for numerous iterations and recorded for calculation !
We then brought these readings into Excel, where the best method we found was to plot all the average values and their relationship and extract an equation from it, which we could later use to calculate the exact motor speed and duration the motors need to be on in order to rotate by a given amount !
And now our vehicle could actually traverse and rotate in a precise manner ! Any remaining imperfections wouldn't even matter, because we could detect the orientation in real time using the magnetometer - but at least it didn't go haywire while rotating.
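If you want the gist of the maths, it boils down to fitting a straight line between the motor on-time we command and the rotation the magnetometer actually reports, then inverting that line. The sample numbers below are made up purely for illustration - the real coefficients came out of our Excel plot.

```cpp
// Hypothetical version of the turn calibration: fit measured rotation (degrees)
// against motor on-time (milliseconds) with a least-squares line, then invert the
// line to get the on-time needed for a requested angle. The sample data is made up.
#include <cstdio>

int main() {
  const double t[] = {200, 300, 400, 500, 600};  // commanded on-time in ms
  const double a[] = {31,  48,  66,  83,  99};   // rotation the magnetometer reported
  const int n = 5;

  double st = 0, sa = 0, sta = 0, stt = 0;
  for (int i = 0; i < n; ++i) {
    st += t[i]; sa += a[i]; sta += t[i] * a[i]; stt += t[i] * t[i];
  }
  // Fit: angle ~ slope * time + offset
  double slope  = (n * sta - st * sa) / (n * stt - st * st);
  double offset = (sa - slope * st) / n;

  // Invert the fit: how long should the motors run for a 90-degree turn?
  double wanted   = 90.0;
  double duration = (wanted - offset) / slope;
  std::printf("Drive the motors for about %.0f ms to turn %.0f degrees\n", duration, wanted);
  return 0;
}
```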
Day 8 • 27th April, 2022
A fresh day, a fresh start - it was time to finish what we started. I dug deep into the wireless connectivity problem ! I fixed the wires, thinking that was what was wrong, dug into the code and the workings of the NRF, and tried to operate it at an extremely high power level to ensure proper reliability. I even ordered a better, stronger version of the NRF module, since I suspected the old one had gotten fried.
Since I found the module to be so sensitive to small voltage fluctuations, I gave it a little stability using a capacitor, and it seemed to hold up pretty well !
With this done, I improved the wiring a bit, and then all I could do was wait for the other components to be delivered !
Day 9 • 30th April, 2022
It's always fun to unbox new stuff ! Apart from a big metal-gear servo to house the scanning head, the assortment of zip ties would help secure the wires perfectly in place - not to mention the multimeter I could use to further troubleshoot the problems I was facing !
And, oh ! I almost missed the smoke sensor. It was a crucial component for the project, for obvious reasons.
The new NRF module looked cool, like a Wi-Fi antenna. But before I could install it, I needed to make sure the whole setup worked as it was.
I also needed a formal way for the software to communicate with the scanner unit via the receiver - the broadcasting ground module !
So, in order to solve that problem, we needed to come up with a language of communication - a programming language that SARDA would understand when spoken to !
Thus we created the SARDA programming language, with which we could communicate with and control the scanner unit vehicle. Here the ground unit - or, one could say, the 'receiver unit' - acts as a compiler: it compiles the program code and sends the machine code over radio communication so the scanner can understand it and act accordingly.
With this simple decision, we took platform independence to a whole other level. We can have any OS running our software in parallel, and it can communicate with and control the whole setup as a node with zero hassle !
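To make the 'receiver as compiler' idea concrete, here's a heavily simplified sketch of what the ground Uno does: read a text command over serial, translate it into a compact opcode and push that opcode out over radio. The command names and numeric codes are invented for this example - the real SARDA language is richer than this.

```cpp
// Simplified picture of the ground unit acting as a "compiler": read a text command
// over serial, map it to a compact opcode and radio it to the scanner unit.
// The command names and opcodes are invented for illustration.
#include <SPI.h>
#include <RF24.h>

RF24 radio(9, 10);                // CE, CSN (placeholder pins)
const byte address[6] = "NODE1";

void setup() {
  Serial.begin(9600);
  radio.begin();
  radio.openWritingPipe(address);
  radio.stopListening();
}

void loop() {
  if (Serial.available()) {
    String cmd = Serial.readStringUntil('\n');
    cmd.trim();

    byte opcode = 0;
    if      (cmd == "FORWARD")    opcode = 1;
    else if (cmd == "TURN LEFT")  opcode = 2;
    else if (cmd == "TURN RIGHT") opcode = 3;
    else if (cmd == "GET ORIENT") opcode = 4;  // ask the scanner for its heading

    if (opcode != 0) {
      radio.write(&opcode, sizeof(opcode));    // the scanner decodes this and acts on it
      Serial.println("OK");
    } else {
      Serial.println("ERR: unknown command");
    }
  }
}
```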
Before we move on, I have to mention that I also integrated the smoke sensor on the second breadboard, where it fit beautifully. When the need arises, it will give us information like how much smoke a hypothetical fire is producing, whether the air is harmful for humans to breathe, and more !
Day 10 • 1st May, 2022
With the new NRF module, a new ground receiver was long overdue !
So, I brought some of those lamination fibres and started working.
First of all, I measured and worked out how big the ground receiver should be and what shape I wanted it in. After that, I used a pencil and the carbon knife to make the markings, followed by the traditional way of cutting such materials !
I used the carbon knife to cut the lamination fibre precisely into shape, and even cut some small pieces to make a little tower to house and support the antenna of the new NRF module. This would make it look like a Wi-Fi router.
When I was finished, I joined all the little pieces together with industrial glue and attached the NRF module with the antenna pointing up at the sky.
With this done, it looked just like a DIY homemade Wi-Fi router, and I was really happy with it.
I used my Arduino Uno to power the receiver and handle the transmission process, since the receiver unit was big and had plenty of space for the Uno.
I used jumper wires to connect everything, uploaded the code and tested it to make sure it worked as well as the previous installation.
So, with the stationary ground communication module up and running - which looks surprisingly good, by the way - it was time to fire up Unreal Engine in order to develop our software, a.k.a. our visual display unit.
Now, at this point we had our magnetometer, our wireless communication and the compiler (the ground communication module) up and running. We just needed to be able to send the commands not by typing them in by hand, as we were doing earlier, but through the software we were developing.
So, I started working on the Unreal project. Before anything else, I needed to set up the code for serial communication between the software and the ground communication unit !
Here's the plan......
We send a command in our own programming language to the ground communication unit that we just made.
Let's say, for example, a request to get back the current vehicle orientation !
So, we send the code through serial to the Arduino Uno, which then compiles it and sends the respective commands to the scanner unit through radio communication.
Next, the scanner unit sends back the magnetometer reading, which is received and processed by the ground communication unit over radio and then sent back to Unreal Engine through serial.
Now, for those of you who don't know, I already did something similar in my previous project, "Armball mania" ! Check that out if you haven't yet, by the way - a better and more improved version is in the works !
So, I ported that project over and modified the code to work with the current one. For reference I just had a simple 'cone' at this point, as a placeholder, to check whether it worked and whether the orientation updated on screen in real time as we rotated the scanner unit by hand.
Here are the results !
It turned out pretty well, but of course I'm planning to change things around and make it better. I'll have to overhaul the whole system to make it, as I said previously, 'more reliable'.
For the future, I had plans to wrap the data exchange in a JSON package, which would make it a lot easier to extract and exchange multiple pieces of information later on.
But that was a problem for another day !
Day 11 • 3rd May, 2022
This was a big day. I was super excited to get done with the hardware part of the project. I already had the best of servos and two ultrasonic sensors - one was mine, one was Sayanti's ! So, before we could build the whole scanner contraption with the two ultrasonic sensors, I needed to get the servo fixed in place. And oh, if I didn't mention it before, I was planning to mount the two ultrasonic sensors on the scanner head facing opposite directions. This would allow us to scan all four directions with one 90° motion !
So, I tested the servo and hooked it up to the main power source, since this baby needed lots of juice ! The servo needed to be calibrated so I could find the zero position it would rest at. After a little more testing here and there, I hot-glued the servo to the chassis !
This was not enough, and it didn't even look good ! So, in order to support the servo, I used my good old lamination fibre to cut out some carefully designed pieces and attached them on either side of it. These would support the servo and hold it in position.
We needed another such structure to hold the two ultrasonic sensors in place. So, I got to work and designed a 'U'-shaped contraption that would hold the ultrasonic sensors, which I attached and screwed onto the servo before sealing it off with another thin supporting bar !
Getting the ultrasonic sensors fixed in place was a messy job as well, since I relied only on hot glue to hold everything together while it solidified ! But with all the strain and pressure it kept crumbling in. To fix that, I installed another thin supporting wafer to push the two sensors apart and keep them from collapsing inwards.
After fixing the structure to the servo, I just had to connect the wires running from the two ultrasonic sensors to the Arduino ! This wasn't as easy as the others, because the wires had to be of optimal length and at a specific tension: loose enough to let the whole contraption rotate freely without putting much stress on the servo, but taut enough that they wouldn't droop, block the view or get wound up in the mechanism. Nothing a few zip ties won't fix, really.
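In rough outline, the scanning head logic ended up looking like the sketch below: sweep the servo through 90° and fire both HC-SR04 sensors at each step, so one sweep touches all four directions. Pins, the step size and the settle delay are placeholders, not our final values.

```cpp
// Rough outline of the scanning head: sweep the servo through 90 degrees and read
// two opposite-facing HC-SR04 sensors at each step. Pins, step size and delays
// are placeholders.
#include <Servo.h>

Servo scanServo;
const int TRIG_A = 2, ECHO_A = 3;  // front-facing sensor
const int TRIG_B = 4, ECHO_B = 5;  // rear-facing sensor

long readCm(int trig, int echo) {
  digitalWrite(trig, LOW);  delayMicroseconds(2);
  digitalWrite(trig, HIGH); delayMicroseconds(10);
  digitalWrite(trig, LOW);
  long us = pulseIn(echo, HIGH, 30000);  // timeout so a missed echo doesn't stall the sweep
  return us * 0.034 / 2;                 // flight time in microseconds -> centimetres
}

void setup() {
  scanServo.attach(6);
  pinMode(TRIG_A, OUTPUT); pinMode(ECHO_A, INPUT);
  pinMode(TRIG_B, OUTPUT); pinMode(ECHO_B, INPUT);
  Serial.begin(9600);
}

void loop() {
  for (int angle = 0; angle <= 90; angle += 10) {
    scanServo.write(angle);
    delay(200);  // let the head settle before measuring
    Serial.print(angle);                  Serial.print(',');
    Serial.print(readCm(TRIG_A, ECHO_A)); Serial.print(',');
    Serial.println(readCm(TRIG_B, ECHO_B));
  }
}
```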
Day 12 • 6th May, 2022
.....But, something weird happened.....
While testing the magnetometer, it seemed to act really strangely. When the sensor was rotated at a constant speed, the rate at which the values changed would accelerate and decelerate in a vaguely unpredictable way. In other words, it was not giving the values we expected.
So, I used a mapping program to record some values and get a better look at what was going on as I rotated the sensor around in a spherical manner !
And I got what I feared most - all the mapped values were skewed away from the origin, where they should have been centred.
Maybe the magnetic fields of the surrounding components were interfering with the sensor ? Or maybe it was something else. But we knew how to fix it: we needed a calibration function, which we calculated with a handy utility !
After properly 'calibrating' the sensor by running the raw values through our fix function, which applies a calibration matrix, all of the strangeness was gone in the blink of an eye. Now the compass - the magnetometer - was working as intended, perfectly, even on board !
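For the record, corrections of this kind usually boil down to subtracting an offset and rescaling each axis (a diagonal stand-in for the full calibration matrix). The constants below are placeholders - ours came out of the calibration utility after logging the raw readings.

```cpp
// Placeholder hard-iron style correction: subtract the measured offsets and rescale
// each axis (a diagonal stand-in for the full calibration matrix). The constants
// are illustrative; ours came from the calibration utility.
struct Vec3 { float x, y, z; };

const Vec3 offset = { 12.4f, -8.1f, 3.3f };   // centre of the logged point cloud
const Vec3 scale  = { 1.00f, 1.07f, 0.95f };  // per-axis stretch back towards a sphere

Vec3 calibrate(const Vec3 &raw) {
  Vec3 corrected;
  corrected.x = (raw.x - offset.x) * scale.x;
  corrected.y = (raw.y - offset.y) * scale.y;
  corrected.z = (raw.z - offset.z) * scale.z;
  return corrected;
}
```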
Apart from all this hassle, I spent lots and lots of time managing the wires properly, tying them up with zip ties so that no amount of stress or tension could cause a loose connection or disrupt the proper functioning of the whole system ! Since this whole project lives on prototyping boards, this was crucial. Not to mention the matching yellow tape, placed to make it look better and for added support.
I also took some time to add a fully functioning joystick and a mode-switch button ! Why the joystick ? Well, as I stated earlier, we would be able to control the scanner unit directly from the visual display unit, i.e. the software; but in case we ever needed a manual override, we could just switch the mode and take control of the vehicle with the joystick ! So I soldered the switch - with a little help from my father - and placed the joystick, as well as buzzers and LED lights, to better indicate the status of the whole process and to help debug things if need be !
Day 13 • 8th May, 2022
With the hardware part almost done, it was time to focus on the software, since that was the most unique part of our project ! The software interface we would provide had to be extremely easy to use and modern in terms of UI and UX.
Not to mention, the way we represent the data would be the most fantastic aspect of this project. That was the goal !
So I continued from where I had left off and started working bit by bit towards the bigger picture ! First, I needed to get the JSON handling done !
Serialising and de-serialising JSON strings was a tricky job ! As I said earlier, we needed to serialise JSON strings back on the vehicle's Arduino Nano - meaning we pack all the necessary details into a JSON string so we can transfer it wirelessly to the compiler, i.e. the ground module, which in turn sends the JSON string through serial communication to the computer running the software.
But that's not all: we then need to convert the JSON string back into a structure. In other words, we need to deserialise the JSON string and extract all the information from it !
For that to happen, we needed a function that would handle the extraction in an easy way and be fast at doing so. That is why I first got to work building such a system.
Once that was done, I could easily check it in our programming console window, since we had built a dedicated programming language for the project itself !
We can send a command and the vehicle will automatically perform all the motion and everything else necessary to scan and collect the data at its present location, build the JSON string and send it to the computer for the software to represent !
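On the Nano side, packing a reading into JSON with the ArduinoJson library looks roughly like this. The field names are just examples of the kind of data we bundle, not the exact schema, and the function name is made up for the sketch.

```cpp
// Roughly how a reading gets packed into JSON on the Nano with ArduinoJson (v6-style API).
// The field names are examples of the kind of data we bundle, not the exact schema.
#include <ArduinoJson.h>

void sendReading(float distanceCm, float heading, float tempC, float humidity, int smoke) {
  StaticJsonDocument<128> doc;
  doc["dist"]  = distanceCm;  // ultrasonic range for this scan step
  doc["head"]  = heading;     // magnetometer heading in degrees
  doc["temp"]  = tempC;
  doc["hum"]   = humidity;
  doc["smoke"] = smoke;       // raw gas-sensor value

  char payload[128];
  size_t len = serializeJson(doc, payload);  // the same string later travels over the NRF link
  Serial.write(payload, len);                // printed over serial here just for testing
  Serial.println();
}
```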
The representation I needed to work on before anything else was the pillars !
As I said earlier, the pillars are the unitary 3D structures the software uses to build the 3D environment ! The closer a reading is to the scanner, the bigger the pillar; the farther away, the smaller !
To construct such a structure, our digital twin requests, on each tick, a JSON package with all the environment details needed to build a pillar. It creates a new instance of our pillar object and calls a custom constructor, passing it all the information unpacked from the JSON package. This constructor handles all the rendering and animation work itself, which I'm not going to go into much detail on at the moment... because we still had work to do - we needed to build a world... a sandbox !
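The core of the pillar maths is simple enough to sketch outside the engine: nearer readings give taller pillars, and the edge colour fades from white to red as the reading ages. Every constant below is invented for illustration - the real values were tuned by eye inside Unreal.

```cpp
// Plain-C++ sketch of the pillar mapping used by the visual display unit:
// nearer readings produce taller pillars, and the edge colour fades from white
// to red as the reading ages. All constants are invented for illustration.
#include <algorithm>

struct Color { float r, g, b; };

// Height scales inversely with distance (clamped so far readings stay visible).
float pillarHeight(float distanceCm) {
  const float maxHeight = 200.0f, minHeight = 10.0f, referenceCm = 50.0f;
  float h = maxHeight * referenceCm / std::max(distanceCm, 1.0f);
  return std::min(std::max(h, minHeight), maxHeight);
}

// Edge colour: pure white when fresh, fading to pure red once the reading is "old".
Color pillarEdgeColor(float ageSeconds) {
  const float staleAfter = 60.0f;                     // invented threshold
  float t = std::min(ageSeconds / staleAfter, 1.0f);  // 0 = fresh, 1 = stale
  return Color{ 1.0f, 1.0f - t, 1.0f - t };           // white fades to red
}
```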
Day 14 • 22nd May, 2022
This is my favourite part !
Today, I am going to work on the visual aspect of the software.
Now, I absolutely love game development and game design ! Apart from that, I have made several 3D artworks previously, so this is the part I enjoy the most.
This was a technical piece of software, and most technical software looks boring - but not ours. We were going to make it as modern and beautiful as my other games. For the look, I wanted a really pure, white, well-lit, empty space with grid lines on the floor, where all the pillars and our digital twin would cast really soft shadows. That would give us the soft, well-lit, professional look we were going for !
Most software I have seen has really old-fashioned 3D renders, but we were going to make the shadows and the lighting super accurate and super smooth - and the way we were going to achieve that was by implementing real-time ray tracing in the project !
I am talking about fully raytraced shadows and fully raytraced lighting along with 4K textures and high-polygon models !
I got to work, and it was a long, long journey...
I designed the models in 3D Studio Max and brought them into Unreal Engine to texture and animate them !
Now, the environment - that took a lot of time. I experimented with different parameters of the directional light to get the soft look I was after.
The grid pattern was custom-made in Photoshop so we could change it anytime, and I went through different variations while working on it in order to get the look I was going for !
Day 15 • 29th May, 2022
Once the environment was done, we needed some way to control the scene !
The user needs to be able to rotate and look at the scene from any point of view ! They can even zoom in if they want to.
And oh, if they release the mouse button, the camera comes snapping back to the position best suited for viewing the digital twin it is paired to.
This pretty much turns our software into a full-fledged 3D application, because the controls work exactly like a 3D package - say AutoCAD, 3D Studio Max or Maya, which is where we took our inspiration from.
Now, I had never worked on such intense camera controls before, so a lot of experimentation needed to be done !
I am talking about night after night of long working hours to learn and experiment with the camera.
I am not going to go into the details of what happened, because it was a long, long journey.
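For the curious, though, the heart of those controls is just spherical-coordinate maths: the camera sits on a sphere around the digital twin at a yaw, pitch and distance driven by the mouse, then eases back to a default when the button is released. Here's an engine-agnostic sketch of that idea - not our actual Unreal code.

```cpp
// Engine-agnostic sketch of the orbit camera: the camera sits on a sphere around
// the digital twin at the current yaw/pitch/distance, and eases those values back
// toward their defaults when the mouse button is released.
#include <cmath>
#include <algorithm>

struct Vec3 { float x, y, z; };

Vec3 orbitCameraPosition(Vec3 target, float yawDeg, float pitchDeg, float distance) {
  const float d2r = 3.14159265f / 180.0f;
  float yaw = yawDeg * d2r, pitch = pitchDeg * d2r;
  return Vec3{
    target.x + distance * std::cos(pitch) * std::cos(yaw),
    target.y + distance * std::cos(pitch) * std::sin(yaw),
    target.z + distance * std::sin(pitch)
  };
}

// Called every frame: pull the current value a fraction of the way back home,
// which gives the snap-back feel without any sudden jump.
float easeToward(float current, float home, float speed, float dt) {
  return current + (home - current) * std::min(speed * dt, 1.0f);
}
```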
Day 16 • 30th May, 2022
I went all in with animations !
Now, just like the camera, animation was not my strong point, so I had to learn a lot of new things in order to pull it off !
I won't go into much detail, but for the animation I wanted each pillar to appear from nothing, scaling itself up in a really unique, animated way that would look really cool ! So yes, I worked on it a lot to get the animation I had in my head.
Not to forget, the colour-changing feature of the software also falls under animation ! What the feature actually does is this: when a pillar is drawn, or animated up, the colour it emits from its edges - pure white initially - changes over time into red, to show that enough time has passed for it to be an outdated reading !
Yes, I meant it when I said our software was going to have some really unique presentation techniques !
Day 17 • 31st May, 2022
After everything else had been done, I needed to change the way our digital twin was represented ! It looked really old-fashioned and outdated, because it was actually just a placeholder for what we would create later - and it was finally time to make that happen !
So, I fired up 3D studio Max and designed a new one.
This was going to be the main centre of attention, so it needed to be elegant and good-looking !
After coming up with the proper design, I brought it back into Unreal, textured it and made it a bit luminous to properly match the feel of the environment !
After I was finished, it looked really good and matched the colour scheme, so we were good to go !
At the end, I brought in a post-processing volume to add that sweet touch of colour correction and colour grading and give it that spicy look !
I actually made the lookup table in Photoshop from scratch, just for this project, in order to achieve the custom look I was going for !
Day 18 • 2nd June, 2022
Everything else had been done at this point; it was now time to add the buttons and the controls for the user to operate the software, along with the heads-up display, which would show the information about our digital twin on screen - the temperature, smoke, humidity and orientation data !
I wanted all the environmental data to be displayed on the left, on floating islands, and the orientation data, in degrees, at the top !
Not to mention, all this data is also, as I stated previously, displayed on a floating HUD attached to the top of our digital twin, so the user never has to take their eyes off it ! That's what I call juicy UX design ;)
So, as usual I fired up Adobe XD in order to design the user interface !
Day 19 • 8th August, 2022
It was early August, and I was working on the project - perfecting and improving the way the whole system worked and, yes, working on the visual and technical aspects of the software as well.....
When I received a call !
We were invited to the national science fair exhibition at the end of the month, where we were to present our projects at the university's booth ! I mean, this was a huge deal for us !
So, with a strict deadline of barely a couple of weeks, we began rigorously testing the bot, since almost everything was done at this point !
Just to summarise,
When the robot - the scanner unit - is deployed onto a disaster site, any place that has been struck by an earthquake, fire, etc., it automatically traverses the terrain, scanning and gathering information about its surroundings. After each step, it bundles the scanned data, along with information like the smoke level, temperature and humidity - measured with the gas sensor and the temperature sensor respectively - into a JSON package, then sends it through radio communication to the stationary communication module. That module works as a middle man, or translator, for the computer it's connected to: when it receives the JSON package, it passes it to the software we developed over serial communication. The software, in turn, deserialises it and plots it on screen for the user to see !
As I said earlier, the user can rotate the scene, view the information and the scanned 3D structures from different angles, and even export everything for later use !
Not to mention, since the scanner unit has a digital twin, it can be controlled directly from the software using the manual override mode, where the view switches to a game-like camera and lets the user drive the digital twin - along with the physical unit - just like a video game ! Isn't that cool ?
Here's a little tech demo, where the scanner unit pathfinds through my room and beautifully maps everything ! The height of the pillars inversely denotes how far away the scanner unit was when that specific pillar was mapped, and the dynamic coloured light on each pillar changes to show how old the mapped pillar really is - not to mention the temperature, humidity and smoke data written on top of each pillar ! Ain't that a unique representation ?
So, yah ! That was the project. It was time to brush everything up for the last time, and wait for the big day..... Until it finally came !
Day 20 • 27th August, 2022
It was the day ! Not gonna lie, we were somewhat nervous, since it was kind of a big thing for us - but we did our best to give them a show :)
Apart from SARDA, we also presented our previous project, i.e. 'Amball mania'..... and the crowd was insane !
We had the booth to ourselves for the day, with all the working units of SARDA displayed clearly in front of us, fully powered throughout the day ! Even the visual unit was running on the laptop !
The exhibition was filled with young kids from all over the city, from different schools and different backgrounds. Helping them understand the workings of this robot and what makes it unique, and presenting new ideas to them, was really satisfying, and we enjoyed it so much throughout the day.
We were acknowledged by various army personnel, professors and other members of the upper echelons of the education circle !
And after a whole day of work and presentation, when the day came to an end... we were finally awarded the grand prize for our efforts !
It made our day !
The whole experience of presenting the project we had worked so hard on was truly unique, and we learnt a lot from it !
And we are willing to do it again.
Epilogue • The End
So, with great trouble and great fun, another project and another year come to an end ! Our work here is done ! We are going to move on and work on something new and exciting - for the next video !
But we need your feedback on how this project really was and how it could have been improved !
If you have some suggestion, let us know under the comment section below.
Until next time, let's just say, we have unfinished work to do !
Goodbye, take care and keep creating :)
Here is the main video of the vlog. Don't forget to watch it :