Welcome back to the Johns Hopkins STS Tech Blog! After a long summer hiatus we are back in the production room on campus, ready for some exciting new projects to come our way. First off, in our own tribute to 9/11, the STS Tech Blog is going to look at the good that technology did on that day and has done since.
There is no doubt that robots are some of the most efficient and beneficial tools we have to monitor and save our lives. They run heart monitors, control vehicles, deliver oxygen to firefighters, and help police communicate in times of danger and confusion. So why can't they also retrieve information from disaster sites that humans are unable to reach?
This is exactly what Robin Murphy, now of Texas A&M University, thought on September 11, 2001. She and her team of computer scientists used compact robots, including the tracked PackBot, to explore areas of Ground Zero that were still on fire, full of debris, or too small for humans to enter. The robots didn't have to worry about the heat or the lack of oxygen, but their tank-like treads made it hard for them to go everywhere, or to move quickly. That only made the team more determined to build better robots to aid human disaster-response efforts.
As the robots' technological sophistication increased, so did government funding and media attention. When Hurricane Katrina hit New Orleans in late August 2005, small flying robots were deployed to search from the air for people stranded on roofs or in boats; because they could cover ground faster than searchers on foot, they helped responders locate victims sooner. After the BP oil spill in 2010, an underwater robot called the Seaglider was able to dive to depths divers could not reach to monitor the spread of the damage, gathering vital ecological data to help marine biologists and oceanographers bring the Gulf back toward equilibrium. In March 2011, after the Fukushima Daiichi nuclear disaster (and the earthquake and tsunami that triggered it), robots were used to closely monitor radiation levels and air temperatures around the reactor sites.
Clearly, robots have an advantage over humans because they are unaffected by most physical extremes; still, there is no substitute for human maneuverability and compassion. Much of this technology remains a promise of things to come rather than a finished product. Some researchers are planning to build robots with more autonomy, but the trade-off is that these robots would look like worms or giant snakes, which may end up scaring victims more than helping them. It sounds a bit silly, but a disaster leaves people in very fragile emotional and physical states; to someone who has just lost their home, nothing seems too strange to fear.
We all remember the horrors of September 11. Now we have the chance to nurture the little good that came out of that rubble and allow it to help us all. Robots are already a huge part of our lives, and this is just the beginning of a new era of American history.