UTSA Doubles Down on Cloud Computing with the Cloud and Big Data Laboratory

This server can have between 20 and 40 processor cores. Your laptop is lucky to have four. Photo by Andrew Moore.

As cloud computing continues to grow in importance, the University of Texas at San Antonio is putting its facilities and students on the cutting edge of cloud technology by investing in a new, multi-purpose cloud computing laboratory for the College of Science.

“Today we are launching one of the largest open clouds in academia here at UTSA, which utilizes OpenStack software and Open Compute hardware, to support advanced computing and data analytics research,” said UTSA’s Vice President of Research Mauli Agrawal.

UTSA President Ricardo Romo cuts the ceremonial ribbon, opening the UTSA Cloud and Big Data Laboratory on March 7, 2014. Photo by Andrew Moore.

UTSA President Ricardo Romo cut the ribbon on the Cloud and Big Data Laboratory today as part of the Open Cloud Symposium held on UTSA’s main campus. More than 200 professionals met in the H-E-B University Center Ballroom for the first day of the symposium, which will continue through Thursday.

The event focused on cloud computing and big data technologies that students now will be able to take advantage of – and if you just hang in there with me, I promise to explain why this is important for UTSA students and our city.

The first thing you need to know about cloud computing is that it has absolutely nothing to do with white, puffy clouds and everything to do with servers – the computers that store the data, programs, and web pages which make up the Internet. If something is in “the cloud,” it is really just stored in a big server farm – located at Google, Rackspace, Amazon, etc. – and is accessible on any Internet-connected device if you have the right software. If you’re using WordPress or Gmail, for example, you’re using the cloud.

More importantly, cloud computing companies don’t just store websites, data, and programs – they also maintain them and guarantee their availability, so a business or individual doesn’t have to own a server at all. And if a business already has its own server infrastructure but needs more storage space or processing power, it can rent additional resources from cloud companies without incurring additional hardware costs.

Thanks to the Cloud and Big Data Laboratory, UTSA now has its very own open cloud, or big collection of servers. The UTSA cloud features 6,600 processor cores and 35 terabytes of RAM – a massive amount of processing power – which can be used to analyze and process extremely large amounts of statistical or demographic information, or what techies like to call “Big Data.”
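For the curious, the divide-and-conquer idea behind that kind of large-scale analysis can be sketched in a few lines of Python. This is purely an illustrative toy on a single machine – not code from the UTSA cloud – but the pattern is the same one a cluster uses at far larger scale: split the data into chunks, process each chunk on a separate core, then combine the partial results.

```python
from multiprocessing import Pool

def chunk_mean(chunk):
    # Each worker process computes a partial result for its slice of the data.
    return sum(chunk) / len(chunk)

def parallel_mean(data, workers=4, chunks=8):
    # Split the data into roughly equal chunks and farm them out to
    # worker processes running on separate cores.
    size = max(1, len(data) // chunks)
    pieces = [data[i:i + size] for i in range(0, len(data), size)]
    with Pool(workers) as pool:
        partial = pool.map(chunk_mean, pieces)
    # Combine the partial means, weighting each by its chunk's length.
    total = sum(m * len(p) for m, p in zip(partial, pieces))
    return total / len(data)

if __name__ == "__main__":
    data = list(range(1_000_000))
    print(parallel_mean(data))
```

With four cores the work is shared four ways; with 6,600 cores, the same idea lets researchers chew through datasets no single machine could handle.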

The UTSA Open Cloud is just one of two existing clouds in the Open Compute Project – the other being at the Industrial Technology Research Institute (ITRI) in Taiwan. It is an “open” cloud because it is based on “open” technology. The cloud runs Rackspace Hosting’s OpenStack cloud operating system, which Rackspace developed in partnership with NASA, and uses Open Compute hardware originally created by Facebook. The inner workings of both technologies are available to the public to use and modify as they see fit – unlike, say, an Apple operating system, which only Apple can modify.

The UTSA Open Cloud offers several major benefits to students and faculty, starting with the fact that its hardware is among the most advanced on the market.

“UTSA now has state-of-the-art hardware,” UTSA Open Compute Project Assistant Director Carlos Cardenes said. “When we got this hardware, we got it before anyone else. Facebook was actually still testing this out by the time we got the hardware.”

Students and faculty will now be able to use the processing power of the UTSA cloud to analyze large amounts of data for research on a much larger scale than was previously possible. This is especially useful for demographic, genomic, or financial data, which can require many servers to analyze effectively. As one of the largest open clouds in academia, the UTSA cloud will also help attract top researchers and move the university toward Tier One research status.

Students and faculty will also be on the cutting edge of computer design thanks to the new Open Compute Product Certification and Solution Lab opening with the UTSA cloud. The lab will receive server designs from across the country and put each through its paces. Open Compute Project Foundation President Frank Frankovsky, who was instrumental in creating the new lab, believes that the open-source, community-based approach to server innovation is the best way to reach tomorrow’s technology.

“Whether it is CPU power or storage density, for years (technologies) have been on an upward trend. Those increases in performance and capacity are starting to flatten out,” Frankovsky said. “Open efforts are what is going to hopefully crack through that plateau, so that we collectively figure out how to make the technology move forward faster.”

Open Compute Project Foundation President Frank Frankovsky. Photo by Andrew Moore.

But the students will do more than just test new server designs. Students and faculty will help standardize server designs submitted by individuals and third parties, granting OCP Certification to designs that pass. Certification verifies that a new design functions correctly across the many situations and environments of the computing world. It also means students will see the designs of tomorrow being created today, giving them a huge advantage after graduation.

“The students at UTSA will now be able to get their hands on the most leading technology that exists today from an IT perspective,” Frankovsky said. “The designs that are being contributed to the UTSA lab are the most fresh ideas that exist in the IT computing space today. Providing the students with the ability to get hands-on time with these designs is really important for the competitiveness of the students who graduate from here.”

The Cloud and Big Data Laboratory was supported and funded by both Rackspace Hosting and the 80/20 Foundation – the personal philanthropic organization of Rackspace Chairman and CEO Graham Weston. Like many other tech initiatives in San Antonio, UTSA’s Cloud and Big Data Lab is a big step toward growing and retaining an educated tech community in our city. Rackspace CTO John Engates sees the new facilities at UTSA as a crucial spot where the tech industry and tech research will connect in the future.

“(Open Compute at UTSA) allows us to attract the right kinds of people who are going to be innovative and build things and also allows us to retain them – if they are going to go on and be faculty or go out into the ecosystem,” Engates said. “You need a virtuous cycle to emerge where people are churning out great ideas and people, and those people are going to start companies, and those companies are making money and attracting other workers to the area who stick around and stay in San Antonio.”

*Featured/top image: This server can have between 20 and 40 processor cores. Your laptop is lucky to have four. Photo by Andrew Moore.
