Saturday, June 11, 2016

Networking the Cloud for IoT - Pt. 2: Stressing the Cloud

Dwight Bues & Kevin Jackson

(This is Part 2 of a three-part series that addresses the need for a systems engineering approach to IoT and cloud network design. Part 1 is Networking the Cloud for IoT - Pt. 1: IoT and the Government.)


IoT: Unprecedented Stress on the Cloud and Its Underlying Network

Karen Field, Penton Communications’ IoT Institute director, postulated in her article “Start Small to Gain Big” that an oil drilling platform with 30,000 sensors would generate about 1 terabyte of data per day, and she stressed that only about 1% of that data would likely ever be used. From a systems engineering point of view, this flow is multiplied across the trillions of other IoT sensors feeding the cloud, introducing unprecedented data processing and data transport stress. Industries, and the competing companies within them, will also be forced to weigh the economic impact of paying for that transport and processing.
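To put those figures in perspective, the short sketch below works through the arithmetic they imply for a single platform and then scales it up. The 1 TB/day, 30,000-sensor, and 1% figures come from the article; the fleet size used for scaling is purely an illustrative assumption.

```c
/*
 * Back-of-the-envelope sketch of the data volumes implied by the
 * 30,000-sensor drilling-platform example. The per-platform figures are
 * from the article; the fleet size below is an illustrative assumption.
 */
#include <stdio.h>

int main(void)
{
    const double platform_bytes_per_day = 1e12;   /* ~1 TB/day, per the article */
    const double sensors_per_platform   = 30000;  /* per the article            */
    const double useful_fraction        = 0.01;   /* ~1% of data actually used  */

    double bytes_per_sensor_per_day = platform_bytes_per_day / sensors_per_platform;
    double platform_bits_per_second = platform_bytes_per_day * 8 / 86400;

    printf("Per sensor:   %.1f MB/day\n", bytes_per_sensor_per_day / 1e6);
    printf("Per platform: %.1f Mbit/s sustained\n", platform_bits_per_second / 1e6);
    printf("Useful data:  %.1f GB/day (if only 1%% is ever used)\n",
           platform_bytes_per_day * useful_fraction / 1e9);

    /* Illustrative assumption: scale to 10,000 comparable installations. */
    const double installations = 10000;
    printf("Fleet total:  %.1f PB/day across %.0f installations\n",
           platform_bytes_per_day * installations / 1e15, installations);
    return 0;
}
```

Even this rough arithmetic shows a sustained load of roughly 90 Mbit/s per platform, before any of the other data sources sharing the same network are counted.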

How will these parochial, business-centered decisions drive networking priorities across the cloud? Will all of the high-priority data get through? Will any data be lost? How will you know? If a piezo-electric sensor detects a crack in the drill pipe, will you get that notification, or will it be out-prioritized by the ambient air temperature reading that arrives every 10 minutes? Every day, data is delayed across the Internet and the results are not catastrophic. Tomorrow, though, a delayed stock trade “trigger” could cost billions, and key economic indicators could be lost, setting off large economic movements. As with today’s Internet, tomorrow’s IoT will need to ensure that the RIGHT data gets to its destination in a timely fashion.
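One common way to keep the right data moving is to tag every sensor message with a priority and service the most urgent messages first. The sketch below is a minimal illustration of that idea, using made-up message types and priority levels; it is not drawn from any particular IoT platform.

```c
/*
 * Minimal, illustrative sketch of priority-aware handling of sensor
 * messages, so that a critical alert (e.g., a drill-pipe crack) is
 * dispatched ahead of routine telemetry. The message types, priority
 * levels, and payloads are assumptions made for illustration.
 */
#include <stdio.h>

enum priority { PRIO_ROUTINE = 0, PRIO_WARNING = 1, PRIO_CRITICAL = 2 };

struct sensor_msg {
    enum priority prio;
    const char *payload;
};

/* Pick the highest-priority pending message (simple linear scan). */
static int next_msg(const struct sensor_msg *queue, int n)
{
    int best = -1;
    for (int i = 0; i < n; i++)
        if (best < 0 || queue[i].prio > queue[best].prio)
            best = i;
    return best;
}

int main(void)
{
    struct sensor_msg queue[] = {
        { PRIO_ROUTINE,  "ambient air temperature: 18.4 C" },
        { PRIO_CRITICAL, "piezo-electric sensor: crack detected in drill pipe" },
        { PRIO_ROUTINE,  "vibration level: nominal" },
    };
    int n = (int)(sizeof(queue) / sizeof(queue[0]));

    int i = next_msg(queue, n);
    printf("Dispatch first (priority %d): %s\n", queue[i].prio, queue[i].payload);
    return 0;
}
```

In a real deployment this kind of tagging would map onto network-level quality-of-service markings and message-broker priorities rather than a single in-memory queue, but the principle is the same: the crack notification must never wait behind the temperature reading.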

Securing the IoT

Programming Research, in its white paper “How IoT is Making Security Imperative for All Embedded Software,” recommended that software developers take a more careful approach to releasing new IoT products: “Security problems often stem from the need to accelerate development and bring new products to market ahead of the competition. A majority of security vulnerabilities are a result of coding errors that go undetected in the development stage. Carnegie Mellon’s Computer Emergency Response Team (CERT), in fact, found that 64% of vulnerabilities in the CERT National Vulnerability Database were the result of programming errors.” The research firm also believes that software development organizations should adopt coding standards such as CERT C and use the Common Weakness Enumeration (CWE) database. Companies such as Programming Research, Critical Software, and Jama Software offer tools that assist with static analysis of code against these standards. Fortunately, an increasing number of organizations are making adherence to these guidelines and standards a requirement for both internal development teams and outsourced application development vendors.
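As a concrete illustration of the kind of defect these standards target, the sketch below shows a classic unchecked buffer copy (CWE-120, also covered by CERT C rule STR31-C) next to a bounded alternative. The function names and the device-ID scenario are invented for illustration.

```c
/*
 * Illustrative example of a coding error that static analysis against
 * CERT C / CWE would flag: CWE-120, buffer copy without checking the
 * size of input. Scenario and function names are hypothetical.
 */
#include <stdio.h>
#include <string.h>

#define ID_LEN 16

/* Non-compliant: strcpy() overflows 'id' whenever the input is longer
 * than ID_LEN - 1 characters -- the sort of defect a CERT C checker
 * (rule STR31-C) reports at development time. */
void store_device_id_unsafe(char *id, const char *input)
{
    strcpy(id, input);
}

/* Compliant alternative: bound the copy and guarantee termination. */
void store_device_id_safe(char *id, const char *input)
{
    strncpy(id, input, ID_LEN - 1);
    id[ID_LEN - 1] = '\0';
}

int main(void)
{
    char id[ID_LEN];
    store_device_id_safe(id, "sensor-000042-temperature-probe");
    printf("stored id: %s\n", id);
    return 0;
}
```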

Figure 1, from the TASC Institute “Peer Review” course, illustrates that software defects, although they are facts of life, act like mines in a minefield. Typical “code and test” methodologies effectively clear only a single path through that minefield. System overload, operator error, or race conditions can force the system off the “cleared path” and into unexplored territory. Until recently, this unexplored territory was the very place where commercial vendors installed their “back doors” to perform maintenance, collect metrics, or verify that the software is an authorized copy. Commercial software vendors are now cracking down on these features because they represent security vulnerabilities that a hacker could easily exploit.



Figure 1: Defect Detection
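The race-condition case deserves a concrete look, because it is exactly the kind of defect that a single-threaded walk down the “cleared path” never triggers. The sketch below is a generic example, not taken from the TASC course: the shared counter only loses updates when two threads actually contend.

```c
/*
 * Illustrative latent race condition: two threads increment a shared
 * counter without synchronization. A sequential "code and test" run
 * down the cleared path sees the expected total; under concurrent
 * load, lost updates appear.
 */
#include <pthread.h>
#include <stdio.h>

static long counter = 0;               /* shared state, unprotected */

static void *update_unsafe(void *arg)
{
    (void)arg;
    for (int i = 0; i < 1000000; i++)
        counter++;                     /* unsynchronized read-modify-write */
    return NULL;
}

int main(void)
{
    pthread_t t1, t2;
    pthread_create(&t1, NULL, update_unsafe, NULL);
    pthread_create(&t2, NULL, update_unsafe, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);

    /* Expected 2,000,000; under contention the result is usually lower. */
    printf("counter = %ld\n", counter);
    return 0;
}
```

Guarding the increment with a mutex (or an atomic counter) is the straightforward fix; the larger point is that only testing that deliberately leaves the cleared path will expose defects like this before an attacker or an overload condition does.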




 


Dwight Bues, of Engility Corp., is a Georgia Tech Computer Engineer with 30+ years' experience in computer hardware, software, systems, and interface design. He has worked in Power Generation, Communications, RF, Command/Control, and Test Systems. Dwight is a Certified Scrum Master and teaches courses in Architecture, Requirements, and IVV&T. He is also a certified Boating Safety instructor with the Commonwealth of Virginia and the United States Power Squadrons. He is currently working on several STEM projects, sponsoring teams for competitions in the Aerospace Industries Association’s (AIA) Team America Rocketry Challenge (TARC) and the Robotics Education and Competition Foundation’s VEX Skyrise Robotics Challenge.

Kevin L. Jackson is a globally recognized cloud computing expert, a cloud computing and cybersecurity thought leader for Dell and IBM, and founder/author of the award-winning “Cloud Musings” blog. Mr. Jackson has been recognized as a “Top 100 Cybersecurity Influencer and Brand” by Onalytica (2015), a Huffington Post “Top 100 Cloud Computing Experts on Twitter” (2013), a “Top 50 Cloud Computing Blogger for IT Integrators” by CRN (2015), and a “Top 5 Must Read Cloud Blog” by BMC Software (2015). His first book, “GovCloud: Cloud Computing for the Business of Government,” was published by Government Training Inc. and released in March 2011. His next publication, “Practical Cloud Security: A Cross Industry View,” will be released by Taylor & Francis in the spring of 2016.
 

( This content is being syndicated through multiple channels. The opinions expressed are solely those of the author and do not represent the views of GovCloud Network, GovCloud Network Partners or any other corporation or organization.)
 


