Thursday, October 20, 2011

Hopkins is Using Tech Right

This morning I went on to the CNN tech page to find a suitable article to write about on this blog. To my happy surprise, Hopkins was on the front page for their exceptional use of social media and interactive blogs to attract students. My friends Noah Guiberson and Lucie Fink were even featured in the article! But I digress; let's talk about what Hopkins is doing that's so fresh.

I'm not sure how many of you have heard of or used Hopkins Interactive, but it's a centralized home for student blogs about the many experiences of four years at Hopkins. The blogs are updated daily (at least), keeping pace with the dynamic environment of Hopkins itself. Every week a guest blogger is invited to give their perspective on Hopkins and the opportunities it has to offer, so it's not the same nine people from each class every time. This allows prospective students to get a true view of what it means to be a Hopkins student, beyond the shiny brochures that every other college sends in the mail. HI is a way for Hopkins to stand out and be remembered.

But HI isn't the only way that Hopkins is using blogs to reach its prospective and current students. At the bottom of the HI page is a directory to four other blogs Hopkins runs, including blogs about the majors and minors offered from the point of view of those who study them. No gimmicks, no false appeals, just experiences on the web for the world to see and share. You can also read about what the office of Admissions is up to (especially now that college apps are flooding in!), the social events on campus, and the guest blogs.

Hopkins also surpasses other schools with its effective use of Twitter. There are currently about 30 Twitter accounts associated with some department or branch of Hopkins, including the Applied Physics Laboratory, Peabody, and the famous Shush Lady. All of this, while not always flattering, lets students connect to the campus both on it and away from it.

Congrats, Hopkins Interactive, on your exemplary use of social media! Just another reason to be a Blue Jay and proud.

Friday, October 14, 2011

Adam Riess' Nobel Prize

Congratulations to our own Adam Riess, Krieger-Eisenhower Professor of Physics and Astronomy and staff member of the Space Telescope Science Institute, on his 2011 Nobel Prize in Physics for his work on Type Ia supernovae and Cepheid stars, which showed that the expansion of the universe is accelerating and pointed to the existence of dark energy. We look forward to the next discovery to be made at Johns Hopkins. Hit those books, people!

RIP Dennis Ritchie

Less than a week after the death of Apple's Steve Jobs comes the equally upsetting news that Dennis Ritchie, the man who created the C programming language and helped develop the Multics and Unix operating systems, died on October 8. A true trailblazer in coding and computer systems, Ritchie received the Turing Award in 1983 and the National Medal of Technology in 1998. When he retired from Lucent Technologies in 2007, he was head of the System Software Research Department. He was 70 years old.

Ritchie was the son of prominent Bell Labs scientist Alistair E. Ritchie, himself a pioneer of switching circuit theory. Dennis Ritchie was educated at Harvard University, where he received degrees in physics and applied mathematics, and he completed his Ph.D. work there in 1968, after joining Bell Labs, under the mentorship of Patrick C. Fischer.

A modest man, Ritchie embodied the spirit of computing with his understanding of the simple systems that make complex machines. He will be missed.

Here are some quotes of his that exhibit his character and his views on life:
  1. "I am not now, nor have I ever been, a member of the demigodic party."
  2. "Usenet is a strange place."
  3. "UNIX is very simple, it just needs a genius to understand its simplicity."
  4. "C is quirky, flawed, and an enormous success."



Friday, October 7, 2011

RIP Steve Jobs

The day after the iPhone 4S debuted, the face of Apple died. Steve Jobs, who rebuilt the company's reputation and became its face and figurehead, passed away on Wednesday, October 5, from pancreatic cancer. "Cancer took our Jobs!" wrote one Reddit user, a comment that rose to the top of the thread for good reason. Steve, what a life. You will be missed.

Monday, October 3, 2011

Hot Air Isn't Just Nonsense

It's no secret that one of the hardest problems for America today is how to sustain our energy needs in the age of declining oil supplies. This has nothing to do with politics or reducing our dependence on the Middle East; this is a real problem that college students of today will inherit tomorrow. The most challenging part of this problem is that there's no one type of alternative energy that the entire American public has rallied behind. Sure, solar energy is flashy, but there isn't a mass push to make it more of a dependable source. This is partly because of a lack of cheap, mass-produced technology.

So why not take advantage of another thing that the world is full of: hot air?

That's exactly the idea of Australian entrepreneur Roger Davey, who plans to use the hot air that naturally rises off the Earth to spin turbines, which will convert that motion into electricity. In an official statement, EnviroMission, Davey's company, boasts that the plant could generate up to 200 megawatts of power, enough to supply 100,000 homes. This is an extrapolation from data already collected: the solar updraft tower has been tested on smaller scales on the plains of Spain, where a prototype produced about 50 kilowatts of electricity. Some outside engineers are skeptical of the projected 200-megawatt rating, but few dispute that the underlying engineering is sound.
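As a quick sanity check on those numbers, here is a back-of-envelope sketch in Python, using only the figures quoted above (nothing published by EnviroMission): 200 megawatts spread across 100,000 homes works out to roughly 2 kilowatts of continuous power per home.

    # Back-of-envelope check of the figures quoted above (not official EnviroMission data).
    plant_output_w = 200e6      # claimed output: 200 megawatts
    homes_served = 100_000      # homes the company says that could supply

    watts_per_home = plant_output_w / homes_served
    print(f"Average power per home: {watts_per_home / 1000:.1f} kW")   # about 2.0 kW

    # For comparison, the ~50 kW Spanish prototype would cover far fewer homes.
    prototype_output_w = 50e3
    print(f"The prototype could cover roughly {prototype_output_w / watts_per_home:.0f} homes")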

The tower will be surrounded by a plastic dome as wide as a football field, which will help heat the air. When the air reaches 194 degrees Fahrenheit, it enters the tower and travels up to the turbines. The height of the tower strengthens the air flow, which makes the turbines spin faster, which means more electricity is generated. Because the ground keeps releasing stored heat after the sun goes down, this is an energy source that will not cease at night.

Unlike solar power, which is scarce on a cloudy day, and wind power, which is limited at both high and low wind speeds, the capture of hot air is something that happens every day without fail because of the Earth's natural ability to store heat. This is a solution that can be used almost anywhere, but especially in Australia. Another advantage is the long projected lifespan of the hot air towers, which EnviroMission says could be up to 80 years, far longer than a solar panel's lifespan.

As of right now, EnviroMission is the only publicly traded developer working on this approach, and even with the encouraging data from Spain, the project is still in the planning phase. Understandably, oil companies have reason to be nervous, but Davey says hot air is not meant to replace anything, yet. The current idea is that hot air will become a low-emission alternative to fossil fuels and, if the sentiment sticks, eventually a primary source of power.


Friday, September 23, 2011

The Sky Really is Falling

We all know that the dinosaurs probably met their end when a giant asteroid came hurtling down from outer space and smashed into the Yucatan Peninsula. When the dust cleared, the dinosaurs were gone. Of course, this isn't the first time a mass extinction occurred, but no one ever expects something to fall from the sky. Well, expect the unexpected, because it's about to happen, and America is in its path.

The Upper Atmospheric Research Satellite (UARS) has been monitored by NASA since it began to slow down a few days ago, but the agency still can't pin down where or when the satellite will re-enter the atmosphere. This is not because the scientists aren't paying attention; the satellite is tumbling about its axis of rotation as it falls back to Earth, and with surviving debris expected to hit the ground at anywhere from tens to hundreds of miles per hour, there's a lot of room for error.

In addition to the high speeds and the unstable axis, the satellite will break apart in front of NASA's eyes. NASA expects up to 26 pieces, made of metals like titanium and beryllium, to heat up but not burn up in the atmosphere and to reach the ground. Despite all these risks, NASA is fairly unconcerned.

This is not the first time that a satellite has come crashing down ahead of schedule. In 1979, a 70-ton space station called Skylab fell out of orbit, and pieces of it landed in Australia. In 1997, a woman in Oklahoma walking with friends at 3:30 in the morning was struck by a piece of space junk and sustained no injuries. While being hit by hot, heavy metal moving at high velocity is not a laughing matter, the chances of it hitting anyone are slim to none.

Since about 70% of the Earth's surface is water, the chances of debris crashing into land are diminished. And because there is a chance the satellite or one of its pieces could hit a man-made structure, a lot of people are paying attention to where it's going. The Federal Aviation Administration released a notice on Thursday calling the satellite a "potential hazard" and asking all pilots and other aircraft operators to keep a sharp eye out for falling debris and to report anything to the nearest air traffic control center.
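To get a feel for why the odds are so long, here is a rough, purely illustrative estimate in Python; every number below is an assumption made for the sake of the exercise, not a NASA figure.

    import math

    # Rough, illustrative numbers only; none of these come from NASA.
    earth_radius_km = 6371.0
    earth_area_km2 = 4 * math.pi * earth_radius_km ** 2   # about 510 million km^2

    land_fraction = 0.30        # roughly 30% of the surface is land
    pieces = 26                 # pieces expected to survive re-entry (from above)
    footprint_km2 = 0.001       # assume each piece threatens about 1,000 square meters

    chance_on_land = land_fraction
    threatened_fraction = pieces * footprint_km2 / earth_area_km2
    print(f"Chance a given piece comes down over land at all: about {chance_on_land:.0%}")
    print(f"Fraction of Earth's surface actually threatened: {threatened_fraction:.1e}")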

Once these pieces of metal enter the atmosphere, it will only be a few minutes before they hit the ground. So if you're looking up at the sky on Friday or Saturday (9/23 or 24) and you see something coming at you, run. It's not just a story anymore.

Monday, September 12, 2011

Robots Save The World

Welcome back to the Johns Hopkins STS Tech Blog! After a long summer hiatus we are back in the production room on campus, and ready for some exciting new projects to come our way. First off, in our own tribute to 9/11, the STS Tech Blog is going to analyze the good that technology did on and has done since that day.

There is no doubt that robots are some of the most efficient and beneficial tools we have to monitor and save our lives. They run heart monitors, control vehicles, deliver oxygen to firefighters, and help police communicate in times of danger and confusion. So why can't they also retrieve information from disaster sites that humans are unable to go to?

This is exactly what Robin Murphy, now of Texas A&M University, thought on September 11, 2001. She and her team of computer scientists used compact robots, including PackBots, to explore areas of Ground Zero that were still on fire, full of debris, or too small for humans to enter. The robots didn't have to worry about the heat or the lack of oxygen, but their tank-like treads made it hard for them to move everywhere, let alone quickly. This only made the team more determined to build better robots to aid human efforts to respond to disasters.

As the robots' technological sophistication increased, so did government funding and media attention. When Hurricane Katrina hit New Orleans in late August 2005, flying robots were deployed to search from the air for people stranded on roofs or in boats, and because an aerial search is faster than sending people out to look for victims, responders were able to save many more lives. After the BP oil spill in 2010, a robot called the Seaglider was able to dive to previously unattainable depths to monitor the spread of the damage, measuring vital ecological data that will help marine biologists and petroleum scientists get the Gulf back to equilibrium. Last March, after the Fukushima Daiichi nuclear disaster (and the earthquake and tsunami that caused it), robots were used to closely monitor the radiation levels and air temperature around the reactor sites.

Clearly, robots have an advantage over humans because they are unaffected by most physical extremes; however, there is still no substitute for human maneuverability and compassion. Much of this is still a promise of things to come. Some researchers are planning to build robots with more autonomy, but the trade-off is that the robots would look like worms or giant snakes, which may end up scaring victims more than helping them. It sounds a bit silly, but a disaster leaves people in very unstable emotional and physical states, and no fear seems far-fetched when you've just lost your home.

We all remember the horrors of September 11. Now we have the chance to nurture the little good that came out of that rubble and allow it to help us all. Robots are already a huge part of our lives, and this is just the beginning of a new era of American history. 

Tuesday, May 3, 2011

VNCB is Launched

On April 28 members of STS and the Johns Hopkins Medical Administrative Board celebrated the official public launch of the Virtual New Clinical Building (VNCB) and began the official countdown to the opening of the physical NCB. There's no turning back now!

The VNCB is the result of an 18-month joint project between STS and Hopkins Medicine, designed to make the staff's transition into the new clinical building easier. In the Fall 2010 semester alone, students logged about 4,000 hours on the project, doing the work of 10 full-time staff members at Johns Hopkins. In total, 23 students worked on the project with varying levels of involvement, and they were all honored last week at the Medical Campus for their work on the VNCB.
[Photo: Members of the STS Team and the JHMedicine Administrative Board]
[Photo: The finished product]
For security reasons, no one outside the Hopkins network can view the finished product, but rest assured that it is a beautiful piece of technological art. If you're familiar with Virtual JHU, this is bigger and more detailed than that. It's no wonder it took so long to complete!
[Photo: The head of the STS Team, Justin Tibbels]
[Photo: Two of our most enthusiastic teammates]
This is more of a celebratory update than a blog post on recent research or new technological developments, but since this is such a big event, it's definitely worth congratulating the entire team on a job well done! Just in time for finals, we've finished the largest project ever undertaken by STS. Great job everyone!

Tuesday, April 26, 2011

Google Voice

Google is on fire.

The giant search engine is no longer just a fountain of quick, easily accessed information; it has transformed itself into an email hub, an internet document store, a health and diet tracker, a location tool, and now a phone. The product Google has been pushing recently is a feature that anyone with a Gmail account can use: Google Voice.

Google Voice launched two years ago to centralize and organize the sometimes confusing world of cellphone data and calling. It has only just hit the mainstream because, up until a few months ago, you needed a special invitation to access the site. Google has since expanded the program so that everyone can have a Voice account.

While it is inherently cool to have people call your computer and speak to you as they would on a phone, if you are one of the many people who already has a working cellphone, why would you bother with Google Voice? That's the beauty of it: Google Voice is not a phone service; it uses your computer as an interface for your phone, allowing you to screen calls, "listen in" on voice-mail as it's being left, and view your voice-mail box as you would your email. And because the number is tied to your account, you can log in from any computer and still have access to your phone account. If you ever lose your phone, you could use your Google Voice account to call it, or to monitor calls made by a minor (or by the person who might have taken your phone).

Google Voice is able to recognize that you may have multiple phones at your disposal, and it will sync the voice-mailboxes into one spot, transcribed in an email or a text message, should you so choose. Based on the identity of the caller, or the time of day that a call is made, Google Voice also acts as a call blocker, preventing telemarketers from calling at 3 in the morning (it's happened to me, and it wasn't pleasant). Google Voice will also ring all of those phones at the same time, so if you're really waiting for that call, it's the best way to ensure you don't miss it.

There are other cool features, such as conference calling and the ability to record a call. Google warns that federal and state laws may prevent you from legally recording a call without consent from both parties, but the fact that you can is extraordinarily empowering. As I mentioned before, voice-mails will also be transcribed into your Google account, and if you ever need to search for a small piece of information, it's never more than a search bar away. You'll also be able to forward those transcribed messages to any other email account you want.

Although Google Voice is only available in the United States at this time, that doesn't prevent international calls from being made. Google Voice works with any mobile carrier, and through 2011 the service will be free, including texts. If you ever switch your phone, carrier, or location, the Google number you chose will remain the same because "it's tied to you."

Thursday, April 21, 2011

The Smart Security Grid

One of the biggest issues of the modern day is, without a doubt, energy. We are obsessed with how to use less and generate more of it, or at least take what we have and use it more efficiently. The Obama Administration has noticed this (of course), and expanded the American Recovery and Reinvestment Act to provide for the development of new electric grid technology, which holds the key to advanced energy management and pecuniary savings. Because it is such a large undertaking, the project also promises job creation and energy independence. But this sounds too good to be true, so what's the catch?

The downside to a giant grid is security. A smart grid is, by design, an interconnected web of networks, which makes it vulnerable to attack from hackers and worms, and because it would be such a central part of how the country runs itself, any attack could result in a massive loss of electrical service. Attack isn't the only downside either; nothing like this has ever been done on such a large scale, and there are bound to be unforeseen problems in concentrating so much of the country's energy management in one system.

Even if all these disadvantages are set aside for a moment, there is still the problem of how to run the grid itself. With so many different utilities, providers, and consumers scattered across it, a consistent operating framework is essential. Which one should be used? Experts are still working on that, and it's not as easy as it sounds. Even if a framework were agreed upon, it would almost immediately have to be updated because of the rapid pace at which technology changes. The grid would need to evolve constantly to keep pace with demand.

In order for this project to work, the general public of consumers must be educated about what this change could mean for them. Too often we see only the good in a situation and blind ourselves to the bad. "Oh, better energy management? Sign us up!" we might be tempted to say, but the truth is that this issue is about more than reducing our carbon footprint. We might end up paying more (in dollars) for the giant energy grid, and because no one wants to pay more for something they already have, the majority of people might not even bother to switch. And that's before considering that no general framework or cybersecurity system is in place yet.

Do you know how much information your energy provider has about you? The answer is a lot. Any hacker or worm good enough to crack into the (currently) weak security system in an energy grid could access your information easily and quickly, and because energy companies are extremely wary of sending out any information about security attacks, you might not even know until it's too late.

This is a rather serious post this week about the dangers of technology, but it's here to highlight the need to understand all sides of an issue before we commit to something on a grand scale. Lesson of the day: do your research.

Monday, April 11, 2011

A New Force To Be Reckoned With

Last week physicists working at the Tevatron accelerator at the Fermi National Accelerator Laboratory near Chicago, Illinois, may have made the greatest discovery in modern physics since the Manhattan Project during WWII. The discovery could create quite a stir for particle physicists everywhere, mostly because this "new" particle was wholly unexpected. No one even knows what it is, and because of the sheer surprise of the data, some question its legitimacy. The scientists who ran the experiment estimate the probability that the signal is just a statistical fluke at less than 0.25%, but that still isn't enough significance to publish a definitive claim. Assuming the data are real, this discovery introduces more problems than it solves: Why didn't anyone predict its arrival? How does it fit in with our idea of the atomic model? What is it?
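For context, particle physicists usually express that kind of number as a significance in standard deviations ("sigma"). Here is a small sketch of the conversion, assuming the quoted 0.25% is a one-sided fluke probability and using Python's SciPy library:

    from scipy.stats import norm

    # Convert a "chance this is just a statistical fluke" into a Gaussian significance.
    p_fluke = 0.0025                  # the <0.25% figure quoted above
    sigma = norm.isf(p_fluke)         # one-sided tail -> standard deviations
    print(f"A {p_fluke:.2%} fluke probability is about {sigma:.1f} sigma")   # ~2.8 sigma

    # Convention: roughly 3 sigma counts as "evidence," 5 sigma as a "discovery."
    print(f"5 sigma corresponds to a fluke probability of about {norm.sf(5):.1e}")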

Current speculation is that the particle is a variant form of the famous Higgs boson, which scientists have been hoping to see in a collision since its existence was proposed in the 1960s. The Higgs boson is theorized to be responsible for giving other particles their mass, so its discovery would help elucidate the beginning of the universe and show scientists how mass came to exist.

Cool, right?

Of course, with so many different physicists from all over the world working on this project, there are many alternate theories as to what this could be, one of which is that it is a new physical force that works only at very small distances, like the strong nuclear force that holds atomic nuclei together. Yet another theory is that we have been missing something in our hazy understanding of physics all along, and this is the first clue as to what that "something" is. If so, it could have major implications for one of the biggest open problems in today's physics: how to unify quantum mechanics with gravity.

Because the data set is so small and, as of now, unconfirmed, more tests are absolutely essential to the study of this new particle (or force, or sign of a model heretofore unknown). Unfortunately for every particle physicist on Earth, the Tevatron is going to be shut down sometime this year, as late as September and as early as whenever Fermilab runs out of funding. If allowed to run the same test again, the Fermilab group would have about four times as much data to analyze, enough for either a refutation or a confirmation.

So maybe with all the new cloud computing software going into effect, the Government will be able to spare that money and give it to the scientists at Tevatron. Who knows? This could be the dawn of a new age in particle physics.

Tuesday, April 5, 2011

Adobe Design Achievement Awards

Like any group that produces excellent work, we like to receive credit and recognition for it. This is especially the case for the project we have been working on until very recently. Because it is a system bought by Johns Hopkins Medicine, and therefore its property, the actual project cannot be described here; however, we are still able to get credit through the Adobe Design Achievement Awards (ADAA).

The ADAA is a competition sponsored by Adobe (the folks behind Flash and PDF) that awards a grand prize of $3,000 to the winning individual or team, along with thousands of dollars' worth of Adobe products and the recognition of having won. The finalists are flown to Taipei to present at the International Design Alliance, and it is in Taipei that the winner will be chosen. Congratulations, Hopkins: your STS team has entered this contest!

The best part is that anyone who has an idea and executes it can enter this contest. The rules are that you must be a registered student at an accredited institution, that the work must be your own (and only your own, unless you credit every member of your team), and that at least half of the tools you used to create it must be Adobe products.

There are 15 categories to choose from, although only 12 are open to students (teachers are eligible for all 15). The categories range from photography and video to browser-based design and new game ideas. The point is that anybody, given some free time and access to Adobe products, can enter this competition.

There are three waves of entry submission dates, the first of which has already passed. The next deadline is on April 29, and the final wave starts on April 30 and ends on June 24. If you'd like to be creative and possibly win a trip to Taipei, work on a project in the month after school ends and try your luck!

We are very happy with the end result of our project and hope for the best. Good luck to all!

Tuesday, March 29, 2011

Linux Operating System

When you turn your computer on, you may not be aware of the large amount of communication going on inside of it. Much of that communication involves the operating system (OS) installed on the computer and the applications you run. An OS is, at its core, software consisting of data and programs that relays information between the hardware and the applications. Any computer, including video game consoles and cellphones, uses some type of OS, whether it be Unix, Microsoft Windows, Mac OS X, or, our topic of the day, Linux.

Linux made its debut in 1991 at the hands of Linus Torvalds, then a 21-year-old Finnish student at the University of Helsinki. Torvalds wrote the Linux kernel to be a free and open-source operating system, in contrast to the older Unix and MINIX operating systems. Unix and MINIX both restricted the use of their source code, limiting it to academic institutions and businesses, and MINIX was used more to teach students about operating systems than for practical work outside the classroom.

The kernel of an operating system is the link between applications on the computer and the hardware, specifically the CPU, memory, and devices. The Linux kernel is written almost entirely in C (with some assembly), but software running on Linux can be written in nearly any language, including C++, Java, PHP, Python, and Fortran. It is a monolithic kernel, which means it executes all of the operating system's core code in a single address space to increase performance.
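To make that "link between applications and hardware" concrete, here is a minimal Python sketch, assuming an x86-64 Linux machine, that asks the kernel for the current process ID twice: once through the normal library call and once by issuing the raw system call by hand (the syscall number used here is specific to x86-64).

    import ctypes
    import os

    # Every request an application makes of the hardware ultimately goes through
    # a kernel system call. os.getpid() wraps one; here we also issue it directly.
    libc = ctypes.CDLL(None, use_errno=True)

    SYS_GETPID = 39  # getpid's syscall number on x86-64 Linux (assumed architecture)

    pid_via_library = os.getpid()
    pid_via_syscall = libc.syscall(SYS_GETPID)

    print(pid_via_library, pid_via_syscall)  # both print the same process ID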

Linux is mostly used on servers, which is how it ended up running the 10 fastest supercomputers in the world, but its popularity on desktops has increased in recent years, possibly as a result of a 2007 "rebellion" in the Linux development community. As a result, many applications that used to run only on Windows or Mac OS can now run on Linux, natively or through compatibility layers, including popular games such as World of Warcraft and Team Fortress 2, and more mundane applications such as Mozilla's Firefox browser.

And if you're the kind of person who likes to look at code, the Linux world is growing rapidly. The latest version of the Linux kernel is about 13 million lines of code, and with the ever-present advances in technology, the next one will be longer and more complex. Linux is definitely something to watch and understand in the world of computing and technology.

Tuesday, March 15, 2011

The Switch to IPv6

Forget 2012, because it's 2011 that is the end of the IPv4 world in which we've all grown up. The Internet Assigned Numbers Authority, the organization which controls and coordinates global IP addressing for every computer on earth, is running out of IPv4 (stands for Internet Protocol version 4) addresses. As of February 2011, it was down to the last 2.7% of them.

So does this mean that new computers will become unusable? Of course not. The solution for this problem involves a little creativity, and a lot of new space.
[Figure: The gradual transition to IPv6. Source: http://www.potaroo.net/ispcol/2008-09/fig1.jpg]
To explain this better, imagine that all the IPv4 addresses are like the apartments in New York City. There's a limit to how many there are, and once they're occupied, they stay occupied for good. Any networked device, including printers, has an IP address to tell other computers where to direct their information, similar to the postal service when you want to send a letter to a friend. Unlike people, however, computers never change their addresses, and this has led to a problem equivalent to overpopulation. So, instead of deleting computers from the system, computer scientists have been building New York City 2.0, which they call IPv6.

IPv4 is a 32-bit internet layer protocol that handles the transfer of information through "routes," or pathways on the internet. Every piece of information shared online has three core data points that direct it: who sent it (IP address 1), where it's going (IP address 2), and how it's getting there (the route). This system was created in the 1980s, far before the internet got so huge, and long before anyone predicted it would run out. After all, it allows for 2^32 addresses! Side note: remember that computers use a binary (base 2) system to communicate.

IPv6, on the other hand, is a 128-bit internet layer protocol which can support 2^128 addresses, which is about equal to the impossibly large number 340 undecillion (3.4 x 10^38). This expansion allows for many more devices and users on the internet as well as extra flexibility in allocating addresses and efficiency for routing traffic. As of now, IPv6 is still in the pilot stage, but on June 8, 2011, this will change. Why? It's World IPv6 Day, of course!
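If you want to see the difference in scale yourself, Python's standard ipaddress module (available in Python 3.3 and later) can do the arithmetic; the prefixes below are reserved documentation ranges chosen just for illustration:

    import ipaddress

    # Total address space of each protocol version.
    print(f"IPv4: 2**32  = {2**32:,} addresses")
    print(f"IPv6: 2**128 = {2**128:,} addresses")

    # Example networks (reserved documentation prefixes, not real allocations).
    net4 = ipaddress.ip_network("192.0.2.0/24")
    net6 = ipaddress.ip_network("2001:db8::/32")
    print(f"{net4} holds {net4.num_addresses:,} addresses")
    print(f"{net6} holds {net6.num_addresses:,} addresses")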

On that day, the entire globe will undergo a test of the major IPv6 infrastructure. Many of the most commonly used websites (Facebook, Google, etc.) have enabled or will enable a "gateway translation" that allows IPv4 users to reach IPv6 content. Unfortunately, an estimated 0.05% of IPv4 users are expected to experience technical issues on this day. These are exactly the problems that World IPv6 Day is trying to uncover and fix before IPv6 is launched at full scale, which might happen in a decade or so.

So don't worry about losing internet access yet, unless you are part of the 0.05% who have issues with the switch to IPv6. The point is that it's out there, and we're ready.

To all Johns Hopkins students, have a safe, restful, and fun Spring Break. The blog will return in 2 weeks. Until then, enjoy this small joy.

Wednesday, March 9, 2011

Cloud Computing Part 2: Applications

So a major question you should be asking right now is: Why are we so worked up about cloud computing now if it's been around and in use for a couple of years already? There are two answers. The first is that Microsoft has now become a cloud supplier, and since they've been doing a lot of advertising about it lately, the idea has finally reached the public's ears.

The second reason is far less fickle than the public's attention: The Obama Administration has asked agencies to cut the number of federal data centers by 40% in five years and identify 3 applications to move to a cloud in the next 18 months, one of which must be moved successfully in only 12.

This means that the U.S. Government will be on clouds (and servers) that could potentially be accessible to anyone. Obviously there's an issue with confidentiality, but ever since WikiLeaks went live, the pressure for government transparency has been sky high. Now, they have decided, is a good time to switch to cloud computing, for a number of reasons, one of which is cost effectiveness. Because cloud computing is usually cheaper than owning multiple copies of the same software, the United States Government can trim its technology budget, however marginally. This doesn't mean that taxes are going down anytime soon, but it does mean that the deficit won't be as large as it would have been without the cloud.

The question the Government is now asking is which cloud provider will best meet its needs. The General Services Administration (GSA) is the odds-on favorite (mostly because it is itself a government agency) for most departments, but since each department is run individually, the choices will be varied. Some may choose public cloud providers such as Amazon or Salesforce, and some may even choose Microsoft.

The most technically savvy departments will run their own clouds because they already have the infrastructure and tech support to do so. For example, NASA's cloud, nicknamed Nebula, is an open-source cloud computing service designed specifically for NASA scientists and their data processing. The Department of Energy (DOE) will be using its own Magellan program because of the specialized requirements of its research problems and the lack of this technology on public clouds.

Other, smaller municipalities are heading in this direction as well. New Mexico is looking to reduce IT and electricity costs with cloud computing, and in 2010 New York City made a deal with Microsoft for a single, citywide license under which the city will pay only for the applications that city employees actually use.

So the next time someone says you have your head in the clouds, retaliate by saying "That's where all the information is." You wouldn't be lying either.

Wednesday, March 2, 2011

Cloud Computing Part 1: Basics

To the cloud?

That's where everyone, from commercial users to large companies to the United States Government, seems to be going. Although the concept of cloud computing is relatively new, it has impacted the way we all think about data storage and program use, even if you have no idea what it is. You are not alone in this oblivion; until yesterday, this blogger had no idea what cloud computing was, but I knew I needed to find out.

This is precisely the reason that this week's article is about cloud computing: few people have a clear idea of what it is, but it's about to change everything about how we work with others.

[Image source: http://tomlambert.com/cloud-computing-will-rule-the-world/]
Cloud computing is best thought of as "pools" of computing resources and data storage, with each pool run by an individual major company, such as Google or Microsoft. The programs offered by the company are accessible to every computer in the cloud, just as they are now over the internet or through software installation, but the best part is that you don't need to buy individual resources and there are no compatibility issues. Essentially, the hardware itself becomes virtualized. Continuing with the pool analogy, it's more like renting a kick-board or flippers at the pool instead of buying your own. This greatly reduces the cost of operating large (and small) businesses, one of the chief benefits of cloud computing.

Yet another useful feature of the cloud is that any computer within it can be accessed by another computer (hopefully with a password of some sort to prevent unauthorized access). While these systems seem like prime targets for attack, the companies that create and maintain the clouds recognize this and take extensive measures to prevent it from happening. Specialized teams at cloud headquarters act as highly observant lifeguards on the edges of the pool, ready to jump in and save any piece of data that's in danger. In truth, even if a malicious person wanted to bring down an entire cloud cluster, there are so many nodes (sources of data) that it would take far too long to succeed.

One of the best parts of the cloud is the almost endless data storage. While users pay for the ability to store data on a cloud, it's actually a better investment than keeping multiple back-up files on flash drives and external storage units. Even if you own several computers, the cloud makes the files and programs of all of those machines available on whichever computer you happen to be using at that moment.

[Figure: The layers of a cloud]
To gain access to a cloud, you first purchase it from a vendor, which could be Microsoft's Azure, Google's App Engine, or any of the others not mentioned here; it is a matter of which best fits what you need to do on the cloud. Your computer, the client, interacts with applications on the cloud in the same way you, as a human, interact with your computer's normal (non-cloud) programs. The platform works with the cloud's infrastructure and controls the deployment of programs, minus the cost and extensive tech support. The infrastructure is the cloud itself, and it varies with each cloud's operating system, but the overall goal is to connect it to massive servers, which are where the data and programs are stored. These are maintained in buildings far away from your computer, so you don't have to worry about them at all. Although certain parts of each level interact with each other, the majority of the information stays within one layer, minimizing the potential for security breaches and system downtime. The clouds are so effective that they are up and operational 99.999% of the time (which works out to only about five minutes of downtime every year).
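That "five nines" figure is easy to check yourself. Here is a quick Python sketch of the arithmetic for a few common availability targets:

    # Downtime per year implied by a given availability percentage.
    SECONDS_PER_YEAR = 365 * 24 * 60 * 60

    for availability in (99.9, 99.99, 99.999, 99.9999):
        downtime = SECONDS_PER_YEAR * (1 - availability / 100)
        print(f"{availability:>8}% uptime -> about {downtime / 60:6.1f} minutes of downtime per year")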

Overall, the cloud seems to have so many benefits that even the Obama Administration has declared that the number of federal data centers will be cut by 40% in five years (about 420 of them in total), and that various agencies "identify applications to move to the cloud within 18 months," according to Government Computer News magazine. There will be more about this in next week's blog article. As for now, you've been inundated with enough cloud information.

Special thanks to Eric Caruso of Brown University for his help in explaining to me (in simple English) what a cloud is.

Tuesday, February 22, 2011

Future Projects for STS

If you've been an avid reader of this blog, you may have noticed that the last few entries have not been about JHU or STS. The reason is that the entire department has been working on an outside project which, after 20 months of solid work, is finally coming to a close. So here's a list of the projects that STS is planning to work on for the remainder of 2011:

  1. Modeling the Bloomberg School of Public Health: As we did for the wildly successful Virtual JHU, STS would like to create a Google plug-in model of the Bloomberg School of Public Health in downtown Baltimore. This project would likely take about 6 months, depending on the number of hours the students are able to spend working on it, and assuming no other major projects are given to us.
  2. Reducing the size of the Virtual JHU program to make it faster and easier to load: pretty much what it sounds like. In technology, there's always room for improvement.
  3. Collegetown Shuttle Tracker: Have you ever used the Collegetown Shuttle? This project would use a 3D Flash interface to show you where the shuttle is, based on a coordinated GPS tracking system (a rough sketch of the idea appears at the end of this post). Students could follow the shuttle's path to know exactly where it is and where it makes its stops. The resource would be available to all students who use the Collegetown shuttle, including those at other schools. For security purposes we will require a log-in screen that needs a valid college ID to access the tracking page, but we feel this will go far.
  4. Continuation of the STS TechBlog updates.
  5. Meeting the needs of those around campus who need our skills, such as:
    1. Creating fliers for official campus activities and meetings.
    2. Assisting students with technology problems.
    3. Writing technical manuals for new freshmen and transfer students.
  6. Updating our technology pages to make them more aesthetically pleasing. 
  7. Training new employees in STS tasks.
That's pretty much what we do here. If you would like to learn more, feel free to leave a comment.
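As promised in the shuttle tracker item above, here is a minimal Python sketch of how the server side of such a tracker might keep the latest GPS ping for each shuttle. Every name and coordinate in it is hypothetical; it is not part of any actual STS design.

    import time
    from dataclasses import dataclass
    from typing import Dict, Optional

    @dataclass
    class Position:
        latitude: float
        longitude: float
        timestamp: float

    class ShuttleTracker:
        """Holds the most recent GPS ping for each shuttle (hypothetical design)."""

        def __init__(self) -> None:
            self._latest: Dict[str, Position] = {}

        def report(self, shuttle_id: str, latitude: float, longitude: float) -> None:
            # Called whenever a GPS unit on a shuttle phones home.
            self._latest[shuttle_id] = Position(latitude, longitude, time.time())

        def where_is(self, shuttle_id: str) -> Optional[Position]:
            # Called by the (login-protected) tracking page to place the map marker.
            return self._latest.get(shuttle_id)

    # Example usage with made-up coordinates near the Homewood campus.
    tracker = ShuttleTracker()
    tracker.report("collegetown-1", 39.3299, -76.6205)
    print(tracker.where_is("collegetown-1"))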

Tuesday, February 15, 2011

Human vs. Computer: The Proof at Last

Although man created them, machines have always threatened humanity's superiority, and many movies, such as The Matrix and I, Robot, argue that they will eventually overtake us. After years of uncertainty, humans will now be able to discover if they can continue to develop increasingly sophisticated programs, or if they should enter into technological isolation. And it will be decided on television. From February 14th to 16th, the classic game show Jeopardy will have a new contestant: IBM's artificial intelligence program, nicknamed Watson after IBM's founder, Thomas J. Watson.

Watson is truly a feat of computer engineering, and regardless of whether or not it triumphs over its competitors, it should be appreciated as a milestone in technological history. First, Watson analyzes the clue, taking into account colloquial phrases, puns, and phonetic hints. It then chooses its answer by running an enormous number of algorithms at the same time, allowing each to come to an independent answer. The more algorithms that land on the same answer, the more likely it is that the answer is correct. The algorithmic results are then checked against a database. Watson is then able to rephrase his answers as questions, true to the game's format.
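Here is a toy illustration of that "many algorithms voting" idea. This is my own simplified Python sketch, not IBM's DeepQA code, and the candidate answers and confidence weights are made up:

    from collections import defaultdict

    # Each "algorithm" independently proposes an answer with a confidence score.
    candidate_votes = [
        ("Johns Hopkins", 0.7),   # e.g., from a text-search pass
        ("Johns Hopkins", 0.6),   # e.g., from a category-matching pass
        ("Harvard", 0.4),         # a dissenting hypothesis
        ("Johns Hopkins", 0.5),   # e.g., from a knowledge-base lookup
    ]

    # Combine the evidence: answers proposed by more algorithms score higher.
    scores = defaultdict(float)
    for answer, confidence in candidate_votes:
        scores[answer] += confidence

    best_answer, best_score = max(scores.items(), key=lambda item: item[1])
    print(f"What is {best_answer}?  (combined score {best_score:.1f})")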

The database itself is absolutely huge. Watson's expanse of knowledge is stored on 90 IBM Power 750 servers with a total of 2,880 POWER7 processor cores and 16 terabytes of RAM (16 x 10^12 bytes). It is an expansion of IBM's long-term DeepQA project, which is designed to handle hypothesis generation, massive evidence gathering, analysis, and, in this case, scoring. As you can imagine, Watson is a physically huge machine as well, and the "avatar" the public sees on television is only the front end of a massive expanse of wires and humming mechanical boxes on another floor.

Watson was first tested in 2006 by Dr. David Ferrucci, senior manager of IBM's Semantic Analysis and Integration department. At that time, it could answer only about 15% of questions correctly, and it took much longer than the 6-8 seconds human competitors needed to think of an answer. To build up its knowledge, a team of 15 IBM staff members, along with faculty and students at eight different universities, fed it 200 million pages of documents, including plays, novels, dictionaries, encyclopedias, and even the Bible. By 2008, Watson's developers thought it was ready to challenge real Jeopardy contestants. That year, IBM contacted Jeopardy's producers, who agreed to have Watson on the show.

However, before Watson could be on the show, he had to train. By February 2010, it was consistently beating former Jeopardy contestants in mock competitions. On January 13, 2011, Watson beat his future competitors, Ken Jennings and Brad Rutter, in a practice round, and not one of the contestants, human or machine, answered a question incorrectly. The winner of the three-day tournament will take home $1 million, second place $300,000, and third $200,000. Watson's winnings will be donated to charity, while Rutter and Jennings will each donate 50% of theirs.

So what's next for Watson after Jeopardy? IBM's researchers believe that Watson and the technology it represents have the power "to revolutionize many industries," including private banking, telephone operating systems, and medicine. Soon, they argue, it will be possible to diagnose diseases quickly and confidently, raising the level of health care around the world. More data means smarter decisions, which means a better planet. There is no limit to the good Watson and his descendants can do.

Of course, if you're still worried about computers taking over the world, just remember: the documents Watson uses for his information all come from the minds and experiments of humans.

Tuesday, February 8, 2011

Top Careers in Technology

College is a time to expand outside of your comfort zone, a time to explore your interests, and a time to explain to your parents why they should pay for you to get this particular degree. While majoring in something does not necessarily mean that you will go into that field directly (for example, physics majors are frequently hired by banking firms on Wall Street to run analyses), there is no question that majoring in a technical field will help you get a technical job. Here are some of the top careers in technology for 2011, some of which are more obvious than others.

Biomedical Engineering: Biomedical engineers work alongside doctors to provide life-saving technology, such as artificial hearts, prosthetic limbs, diabetes meters and medication, and asthma inhalers. Thanks to the work of doctors and these scientists, people around the world can enjoy a higher quality of life, as well as a longer one. It just so happens that Johns Hopkins has one of the best BME programs in the world, so if you're not yet in college and interested in BME, check this out: http://www.bme.jhu.edu/academics/applications.htm

Computer Software Engineers: As you can imagine, computer software engineers write code for a variety of reasons: making iPhone apps, video games, and military applications, debugging programs already written, and launching new software. Many of these engineers are finding work at Google and Facebook, which are constantly expanding and growing. Eventually, almost every field will require some coding, but these are the people who live and breathe it.

Atmospheric Scientists: Speaking of water, if weather fascinates you, then consider a career as an atmospheric scientist, more commonly known as a meteorologist. Some atmospheric scientists work at the Weather Channel, updating the public about the weather every day; others study how to better predict the strength of hurricanes; and still others study odd weather patterns. As global temperatures begin to climb, climatologists are going to be in high demand, especially at the National Weather Service.

Civil Engineers: Civil engineers are able to leave their mark on society for years after they finish a project, so long as they build it right! They can work for the government or for a private company, which makes this one of the most versatile careers. Some even go overseas to build major projects in the Middle East, India, and China! Although some say the job is stressful because of all the paperwork, the growth prospects are great.

Security Engineers: With more and more people using the Internet, the risk for identity theft is increasing. Security engineers write software that keeps your computer and files safe from hackers, manages interconnecting databases, and safeguards data transfers. They are constantly updating their systems to stay ahead of crime, and the job growth is projected at 15% over the next few years.

These are only five of the thousands of jobs that exist in a number of varied fields, and it's true that there are many jobs that will be available in 10 years that don't even exist now. If you remember to follow your passion, a career opportunity will present itself out of the major you choose.

Tuesday, February 1, 2011

The Verizon iPhone vs. iPhone 5

[Photo credit: http://www.tipb.com/verizon-iphone/]
On January 9, 2007, Apple unveiled the first iPhone, and thousands lined up outside Apple stores around the globe hoping to purchase one. The highly anticipated mini-computer was easily the most exciting development in modern technology since the iPod. Today, a little more than four years after that announcement, Americans anxiously await the newest addition to the iPhone dynasty: the Verizon iPhone 4. Although not yet available for purchase, the Verizon iPhone will be available for pre-order on February 3 to existing Verizon customers, and released on February 10, 2011.

While every iPhone comes with standard features such as Wi-Fi and 3G connectivity, a video and still camera, and a portable media player, the Verizon iPhone comes with a few features unique to it. Just a few are a high definition video camera, a message indicator light, longer battery life, a unified email storage system, built-in GPS, and iChat. Plus you get the benefit of better coverage, at least in the Northeast region of the United States, including Baltimore. One of the biggest complaints about AT&T's coverage in this region is that the iPhone tends to drop calls after only a few minutes. Unfortunately, an owner of the Verizon iPhone will not be able to use the iPhone overseas, and he won't be able to use the web and make a call at the same time. He'll also have to deal with Verizon's notoriously less friendly customer service center if he has problems.

Overall, Verizon thinks it's ready to handle the addition of the iPhone to its network. When AT&T began to carry the iPhone, the network became slow and congested due to call volume, but Verizon has learned from AT&T's mistakes and adapted. The price is $199 for the 16GB version and $299 for the 32GB. The actual data plan prices vary, and some AT&T customers looking to switch to Verizon may have to pay early-termination fees. But no matter what the price, the Verizon iPhone is sure to make a tidal wave in the technology pool.

Don’t expect Apple and AT&T to just roll over and cede the field to Verizon. In July 2011 the AT&T network is going to launch the iPhone 5, which is the most feature-stacked version yet, including the Verizon iPhone 4. The iPhone 5 will have new additions such as face-recognition security features, video chat, and custom SMS tones, in addition to its new physical appearance with a slim 9.3 mm width and scratch-proof, shatter-proof screen.  The camera, video recorder, and audio will all be in high definition, and the battery will last for 7 hours of talk time on the 4G network. The biggest turn-off for this device is the AT&T network attachment, but in this case, the features might outweigh the occasional dropped call, especially if the user is more prone to texting than talking.

With the constant flow of new Android and “smart phones” on the market today, the biggest fish is now in civil war against itself. Hopefully the other fish, such as the HTC Evo and the Blackberry Torch, have enough force behind them to stay in the water.