The Best Technology Blogs

Tech News and Analysis
ZDNet – This blog is one of the best resources for IT professionals seeking information about products, trends, optimization tips and more. ZDNet has editions for 13 geographic regions, providing you with important news from around the world as well as the local information most relevant to you.
TechRepublic – This website is not just a single blog; its blogging section covers a wide scope of topics, from a Tech Sanity Check to CIO Insights. Definitely worth a look!
The Next Web – A leading publication on internet technology, business and culture. Their conference information alone is reason enough to check them out: every two weeks they update the site with a list of the best conferences from around the world.
CIO – Here’s another site that hosts multiple blogs; beyond that, they’ve also got a section for whitepapers, a job board, and plenty of extensive research and analysis.
CNET – CNET editors bring you the very best in gadgets and tech, from breaking news to reviews and buying guides.
TechCrunch – Lots of news, always interesting and frequently first.
VentureBeat – A firehose of news about startups, innovation, and cool products coming in the tech world.
GigaOm – Some news, but mostly really smart analysis of the tech world.
ReadWriteWeb – Some news, lots of smart people talking tech.
Wired – Less about startups and the Valley, more about tech and real life.
Webware – News, always with a focus on “what does it mean for users?”
Mashable – The dominant source for social media news; you don’t need to go anywhere else.
Bits – Not a lot of exclusive content, but brilliant writers and smart commentary.
The Unofficial Apple Weblog – An awesome resource for anything and everything Apple.
Business Insider Tech – Very business-heavy, but great angles on news stories.
Ars Technica – Heavy on the geekery, but full of interesting thoughts and niche pieces.
Reddit Technology – A great source for tech news and other edifying and entertaining content.
MIT Technology Review – Authoritative coverage of emerging technologies and their real-world impact.


Andrew McAfee’s Blog

Andrew McAfee

Written by McAfee, a principal research scientist at MIT, this blog explores the intersection of business and IT. As the man who coined the phrase “Enterprise 2.0,” McAfee is well versed in the subject, and the proof is in his posts, which offer critical insights and analysis on trends, news and topics in business IT.

Follow: @amcafee | Read the blog: andrewmcafee.org/blog

Around the Storage Block

Calvin Zito

Maintained by Calvin Zito, an IT professional since 1983, this blog explores all things storage within the HP universe. Zito offers tips, product announcements and multimedia podcasts for all storage enthusiasts.

Follow: @HPStorageGuy | Read the blog: Around the Storage Block


Chuck’s Blog

Chuck Hollis

EMC doesn’t just offer top-tier products; they’ve also got some of the best minds in IT on staff, and Chuck Hollis, global marketing vice president and CTO, is one of them. His posts are full of grand ideas and offer a big-picture point of view on IT.

Follow: @chuckhollis | Read the blog: chucksblog.emc.com


CIO Dashboard

Chris Curran

Targeted at CIOs and CTOs, the CIO Dashboard focuses on IT news and topics most useful to the leaders in business IT. Written by Chris Curran, a principal at PwC (formerly PricewaterhouseCoopers), this blog cuts straight to the chase and offers up practical ideas and advice.

Follow: @cbcurran | Read the blog: ciodashboard.com


Cisco Blog – Small Biz


Cisco Systems may be a big company, but they’ve got a group of folks that really get small business. This blog brings together experts in small-biz IT who inform readers on major aspects of running and growing a small business.

Follow: @CiscoGeeks | Read the blog: blogs.cisco.com/category/smallbusiness


The Citrix Blog


As an authority on virtualization and cloud computing, Citrix knows a thing or two about virtual, flexible infrastructures. Readers of the blogs in the Citrix network can expect to find strategies, news and advice about the use of Citrix platforms.

Follow: @Citrix | Read the blog: blogs.citrix.com


CloudAve


Got cloud fever? Then it’s time to take a stroll down CloudAve. This blog aggressively and comprehensively covers everything in the cloud computing space, offering reviews, news and insights to its readers.

Follow: @CloudAve | Read the blog: cloudave.com

Cloud Computing

Chirag Mehta

For readers who want a strategic point of view on cloud computing and social media, look no further than this blog written by Chirag Mehta, an SAP employee. Mehta analyzes startups and shares insights on how cloud services impact IT.

Follow: @chirag_mehta | Read the blog: cloudcomputing.blogspot.com


CloudTweaks


Established in 2009, CloudTweaks aims to deliver news, interviews and analysis for cloud computing enthusiasts. The site boasts contributions from CEOs, CTOs, bloggers and entrepreneurs, so people of all levels are welcome to the discussion.

Follow: @cloudtweaks | Read the blog: cloudtweaks.com


Cutter Consortium


The Cutter Consortium is an IT advisory firm, and its blog is a potpourri of contributions from IT analysts and experts, sharing their wisdom and insight with readers. This is definitely a reliable place for gathering credible opinions in IT.

Follow: @cuttertweets | Read the blog: blog.cutter.com


Data Center Knowledge


Managing and optimizing the data center is probably at the top of every IT leader’s list of priorities. Thankfully, Rich Miller, editor of Data Center Knowledge, takes data centers as seriously as his readers do.

Follow: @datacenter | Read the blog: datacenterknowledge.com


DoubleCloud

Steve Jin

DoubleCloud takes its name from the duality of the public and private cloud approaches. Steve Jin, a VMware employee, writes about virtualization for VMware architects and developers. For those looking for a virtualization/cloud blog that gets down to the nitty-gritty, DoubleCloud is it.

Follow: @sjin2008 | Read the blog: doublecloud.org


Exchange Server Pro


Managing the Exchange server for a company can feel like Mission Impossible at times. The team at Exchange Server Pro, however, is here to help with solutions, tutorials and tips for Exchange Server professionals of all levels.

Follow: @exchservpro | Read the blog: exchangeserverpro.com


Face2Fujitsu


Solving problems and building solutions — that’s what the team at Fujitsu hopes to offer with its Face2Fujitsu blog. The blog humanizes the company by sharing posts from different employees within the company that touch on the robust technology and products that have made Fujitsu famous.

Follow: @Fujitsu_TS | Read the blog: blog.ts.fujitsu.com/face2fujitsu


Fountainhead

Ken Oestreich

Ken Oestreich, a cloud and virtualization marketer for EMC and the writer behind Fountainhead, has a wealth of knowledge when it comes to IT. He compares and contrasts cloud computing approaches with real-world examples and pushes his readers to challenge their limits on everything from data and storage to points beyond.

Follow: @Fountnhead | Read the blog: fountnhead.blogspot.com


Gartner Blog Network


How many IT analysts does it take to screw in a light bulb? Whatever the answer is, there have to be enough of them over at the Gartner Blog Network to do the job. The renowned IT research firm is a stable of great thinkers, and its blog network opens the door to these insightful minds.

Follow: @gartner_inc | Read the blog: blogs.gartner.com


GottaBeMobile


With the dawn of the mobile computing age, the days of being tethered to a desk or workstation are long behind us. Gotta Be Mobile lives its mantra by digging into the mobile world and offering news, reviews and sneak peeks at the latest in mobile technology.

Follow: @gottabemobile | Read the blog: gottabemobile.com


Greg’s Server and StorageIO Blog

Greg Schulz

Greg Schulz is an expert in storage technologies. As a well-known author and speaker, Greg has spoken at length about IT best practices. On his blog, readers can expect real-world tips, practical insights and informed opinions on storage, virtualization and more.

Follow: @storageio | Read the blog: storageioblog.com


Hu’s Blog

Hu Yoshida

As VP and CTO at Hitachi Data Systems, Hu Yoshida’s words carry weight. He digs into the meat and potatoes of storage and virtualization for the company, offering his point of view on emerging trends and technologies.

Follow: @hdscorp | Read the blog: blogs.hds.com/hu


Inside System Storage

Tony Pearson

An acknowledged authority on storage software, systems and services, Tony Pearson, an IBM managing consultant, hosts an insider’s discussion on all things storage on this official IBM blog. He also provides on-the-ground reporting and commentary at conferences and events.

Follow: @az990tony | Read the blog: Inside System Storage


Mastering SharePoint with Bob Mixon

Bob Mixon

If you need a hand with deploying, managing or troubleshooting your company’s instance of SharePoint, Bob Mixon can probably help. As an experienced SharePoint expert, Bob offers valuable resources and insight to his readers. Bookmark this one — you’ll be visiting often.

Follow: @BobMixon | Read the blog: bobmixon.com


Microsoft Security Response Center

MSRC

Hackers and malware creators, beware: Microsoft takes IT security seriously, and if you need proof, look no further than the Microsoft Security Response Center. This official security blog from Microsoft issues bulletins, statistics and updates that keep IT staffers in the loop.

Follow: @msftsecresponse | Read the blog: blogs.technet.com/b/msrc/


Naked Security

Talk about putting yourself out there — this IT security blog hosted by Sophos is smart, hip and sharp in its coverage of malware, hacking and data security. Definitely worth a daily read.

Follow: @sophoslabs | Read the blog: nakedsecurity.sophos.com


Open Port IT Community


Big ideas can produce big results. The idea behind this Intel-hosted open exchange is to share thoughts on IT for the betterment of the field. Intel experts post on a wide spectrum of topics, including cloud, data center optimization and security.

Follow: @intelopenport | Read the blog: communities.intel.com/community/openportit


OS X Daily


As Apple’s influence and innovation in the computing space push forward, more and more businesses are integrating Apple products into their IT infrastructure. Although not affiliated with Apple, OS X Daily covers everything Apple and offers news and useful tips to its readers.

Follow: @osxdaily | Read the blog: osxdaily.com


PandaLabs Blog


Panda Security is a pioneer in cloud-based security solutions, and they leverage this expertise for the company blog, PandaLabs. Offering IT security alerts, advisories and in-depth reports on the state of security, this is a must-read for security buffs.

Follow: @Panda_Security | Read the blog: pandalabs.pandasecurity.com


Rational Survivability


If you like your IT blogs with a splash of wit, Brazilian jiu-jitsu and dragon art, Chris Hoff’s Rational Survivability blog is the perfect mixture of informative and humorous with its posts on security, virtualization and cloud computing.

Follow: @beaker | Read the blog: rationalsurvivability.com/blog


Real Business at Xerox


Although the blog is run and hosted by Xerox, it covers more than just the company’s product offerings, adding anecdotal stories on office productivity into the mix.

Follow: @XEROX | Read the blog: realbusinessatxerox.blogs.xerox.com


Scott Lowe

Lowe is a CTO for EMC, and his blog is a must-read for those who’d like to pick the brain of a veteran IT professional. Some of his posts get deep in the weeds, while others are more casual, technical advice stories.

Follow: @scott_lowe | Read the blog: blog.scottlowe.org


Sean Daniel


Looking for ways to start up, troubleshoot or optimize your servers? Sean Daniel has plenty of tips and tricks for IT staff working on Microsoft’s Small Business Server and Windows Home Server.

Follow: @seandaniel | Read the blog: sbs.seandaniel.com


The Security Catalyst


Ever feel like IT security is a little too removed from humanity to be effective? Michael Santarcangelo did, which is why he founded the Security Catalyst, a blog that aims to connect the dots between people and IT security.

Follow: @catalyst | Read the blog: securitycatalyst.com/blog


Small Biz Go Mobile


Mobile computing is everywhere. As the consumerization of IT makes its way into the enterprise, Small Biz Go Mobile, an AT&T-sponsored blog, documents how small businesses are integrating their systems with mobile technologies.

Follow: @smbizgomobile | Read the blog: marioarmstrong.com/smallbizgomobile


Small Business Labs


Businesses produce a lot of data, but they’re often too busy doing business to sit back and sift through it all on their own. Small Business Labs takes the time to aggregate, analyze and research trends in small business, sharing its findings with readers.

Follow: @smallbizlabs | Read the blog: smallbizlabs.com


Small Biz Technology

Ramon Ray

Dedicated to educating and informing small businesses about technology and the impact that it can have on their businesses, Small Biz Technology is a handy bookmark for questions about utilizing social media for business.

Follow: @ramonray | Read the blog: smallbiztechnology.com


Stephen Foskett, Pack Rat

This blog is a buffet of IT information. Though he’s particularly knowledgeable about storage, Stephen Foskett’s posts cover networking and social media, and offer product reviews as well.

Follow: @sfoskett | Read the blog: blog.fosketts.net


Tenable Network Security


There are a lot of great IT security blogs out there. Some of the better ones happen to be published by the vendors themselves. Tenable Network Security’s is one of them. The company’s network security podcast, in particular, is a highlight.

Follow: @TenableSecurity | Read the blog: blog.tenablesecurity.com


Threatpost


The team at IT security vendor Kaspersky Lab knows their stuff. And they share their knowledge with the world at Threatpost, issuing warnings about dangerous malware and worst practices that can leave your IT vulnerable to attack.

Follow: @threatpost | Read the blog: threatpost.com


Toshiba Telephone Systems


Getting the right fit when it comes to your business’s telephone systems can be a daunting task. Toshiba, a leader in telephony, brings together employees and experts to offer tips, advice and case studies on telephony, VoIP and other phone system products.

Follow: @toshibaphonesys | Read the blog: blog.telecom.toshiba.com


Tripwire Blog


This informed and measured blog from IT security vendor Tripwire offers insights, best practices and tips on IT security.

Follow: @tripwireinc | Read the blog: tripwire.com/blog


Uncommon Sense Security

Jack Daniel

Written by IT security veteran Jack Daniel, this blog has a tone as strong as a stiff shot of whiskey. It’s a great read for those who like the unvarnished truth from time to time.

Follow: @jack_daniel | Read the blog: blog.uncommonsensesecurity.com


VMware Security & Compliance


As a leader in the virtualization space, VMware is, logically, also a leader in the security and compliance of virtualization. This blog is updated regularly with announcements, advice and insights. A good read for companies seeking more security as they explore virtualization.

Follow: @VMwareBlogs | Read the blog: blogs.vmware.com/security


VMware Videos

David Davis

This blog from a training consultant provides video tutorials that will guide IT pros through the steps of managing and troubleshooting virtual environments — a reliable source of information on all things pertaining to VMware virtualization.

Follow: @davidmdavis | Read the blog: vmwarevideos.com


Websense News and Views


Published by information security vendor Websense, this blog offers profiles, news and views on the latest trends in IT security. If you have a question about something fishy that’s spreading on Facebook or Twitter, these guys will likely have it covered.

Follow: @websense | Read the blog: Websense News and Views


The Windows Blog


This blog provides its readers with official information on — you guessed it — all things Windows. This includes the mobile phone OS Windows Phone 7, as well as the desktop versions of Windows 7 and the future version, Windows 8.

Follow: @windowsblog | Read the blog: windowsteamblog.com


Wingspan


High performance is the name of the game for Brocade, a networking hardware and software vendor. On this blog, company executives offer their insights on data centers and highly available networks.

Follow: @BRCDcomm | Read the blog: Wingspan


The Wisdom Of Clouds

James Urquhart

Acting as a sage of sorts in the crowded cloud-computing field, James Urquhart writes with authority and clarity on the issues and trends emerging in the cloud. The updates aren’t as frequent as on other blogs, but when he writes, it’s always worth reading.

Follow: @jamesurquhart | Read the blog: news.cnet.com/wisdom-of-clouds


Women 2.0


Let’s face it, the IT world can sometimes seem like a boys’ club. Women 2.0, however, aims to inspire, support and encourage women to contribute and work in IT, particularly by getting involved in startups.

Follow: @women2 | Read the blog: women2.org


The World According to Mitch

Mitch Garvis

Mitch Garvis, an IT pro trainer, knows a lot about a lot of things. And he shares his multitude of thoughts, opinions, rants and ideas on the world of IT professionals in a loose and personal style on his blog.

Follow: @mgarvis | Read the blog: garvis.ca


Yellow Bricks

Duncan Epping

This blog on virtualization, written by Duncan Epping, a VMware employee, dives deep into the strategic and tactical applications of virtualization. The blog gets its name from an Arctic Monkeys song, “Old Yellow Bricks”: solid but flexible at the same time, yellow bricks let you build almost anything, he explains.

Follow: @DuncanYB | Read the blog: yellowbricks.com

Tech and Web Design

Six Revisions – Tips, tools, and great lists of design resources.
Smashing Magazine – Perfect site for the designer newbie, full of great tips and tutorials.
Noupe – Known for its roundups (like “50 best free icon sets”); you’ll constantly bookmark this site.
Hongkiat – Similar tips and tutorials, with a slightly broader focus in design.
I Love Typography – Love fonts and typefaces? Can’t beat this site.
Design Observer – More tips, tricks, and tutorials.
Swiss Miss – The musings of a designer, with a heavy focus on the funky and quirky bits of the design world.
How-Tos and Reviews
MakeUseOf – Endless resource of Top 10 Lists, and geeky hacks you might want to try.
gHacks – Deeper cuts in tech than MUO, but still great for news, tips, and tutorials.
Lost in Technology – Much more approachable than the above sites, it’s a great blog to wade into without much knowledge required.
Mac AppStorm – The best Mac apps on the planet get showcased here.
Web AppStorm – The same idea, but with Web apps.
FreelanceSwitch – Tips, tools, tricks and help for anyone living the freelance lifestyle (more and more of us these days).
40Tech – Encountering tech, particularly geared toward those over 40, but really useful for anyone.
Commentators
Daring Fireball – John Gruber is the smartest man on the planet when it comes to Apple.
Pogue’s Posts – David Pogue’s funny, smart, and a great representative of the common man.
Scripting News – Dave Winer’s as important to the tech landscape as anyone (he’s the godfather of RSS, among other things), and his thoughts on any subject are a must-read.
Search Engine Land – Danny Sullivan knows his stuff when it comes to search—and there’s a lot more to it than you might think.
All Things D – A group of thinkers from the Wall Street Journal, all discussing, analyzing and talking tech.
Dustin Curtis – Love the way his site looks more than anything, but he’s a great observer of the world of blogging, design, and art.
MinimalMac – Mostly a links roundup, but a phenomenal resource for anyone looking to make their Mac work for them.
Robert Scoble – The blog of author, tech evangelist, & Rackspace employee Robert Scoble.
Tips, Tricks and Hackery
Lifehacker – The grand poobah of “little things to make your life more productive, more efficient, and more awesome” blogs.
Digital Inspiration – Amit is clever, easy to understand, and full of cool and interesting tips for everyone.
Unclutterer – Much-needed help for getting the crap out of our way so we can get important things done.
HackCollege – Tons of useful tips on hacking college and succeeding in school, but with ideas useful for anyone.
Smarterware – Gina Trapani, the founder of Lifehacker, took to Smarterware to share more great tricks, and never disappoints.
Lifehack – Somewhat broader in its thinking than Lifehacker, but a great place to find tips to make every little piece of your life work a little better.
Switched – All things geek, particularly the culture of techies that is forming.
Make Magazine – Do awesome stuff with your stuff. That should totally be their tagline.
Cool Stuff
Gizmodo – Gadgets, gadgets, gadgets!
Engadget – Even more gadgets!
Boy Genius Report – They’re full of rumors and leaks, and are almost always right.
jkOnTheRun – Arguably the most seasoned gadget-heads out there, they’re a smart, thoughtful, and objective resource for all things gadget and mobile.
For Funsies
Xkcd – Want to know how nerds think? Read this comic. That’s exactly it.
Boing Boing – The interesting, quirky, strange, and weird things in the tech world.
Neatorama – Awesome things; that’s their only criterion for inclusion, and they stick to it well.

Gartner 2012 Hype Cycle for Emerging Technologies

Big data, 3D printing, activity streams, Internet TV, Near Field Communication (NFC) payment, cloud computing and media tablets are some of the fastest-moving technologies identified in Gartner Inc.’s 2012 Hype Cycle for Emerging Technologies.

Gartner analysts said that these technologies have moved noticeably along the Hype Cycle since 2011, while consumerization is now expected to reach the Plateau of Productivity in two to five years, down from five to 10 years in 2011. Bring your own device (BYOD), 3D printing and social analytics are some of the technologies identified at the Peak of Inflated Expectations in this year’s Emerging Technologies Hype Cycle (see Figure 1).

Gartner’s 2012 Hype Cycle Special Report provides strategists and planners with an assessment of the maturity, business benefit and future direction of more than 1,900 technologies, grouped into 92 areas. New Hype Cycles this year include big data, the Internet of Things, in-memory computing and strategic business capabilities.

The Hype Cycle graphic has been used by Gartner since 1995 to highlight the common pattern of overenthusiasm, disillusionment and eventual realism that accompanies each new technology and innovation. The Hype Cycle Special Report is updated annually to track technologies along this cycle and provide guidance on when and where organizations should adopt them for maximum impact and value.

The Hype Cycle for Emerging Technologies report is the longest-running annual Hype Cycle, providing a cross-industry perspective on the technologies and trends that senior executives, CIOs, strategists, innovators, business developers and technology planners should consider in developing emerging-technology portfolios.

“Gartner’s Hype Cycle for Emerging Technologies targets strategic planning, innovation and emerging technology professionals by highlighting a set of technologies that will have broad-ranging impact across the business,” said Jackie Fenn, vice president and Gartner fellow. “It is the broadest aggregate Gartner Hype Cycle, featuring technologies that are the focus of attention because of particularly high levels of hype, or those that Gartner believes have the potential for significant impact.”

“The theme of this year’s Hype Cycle is the concept of ‘tipping points.’ We are at an interesting moment, a time when many of the scenarios we’ve been talking about for a long time are almost becoming reality,” said Hung LeHong, research vice president at Gartner. “The smarter smartphone is a case in point. It’s now possible to look at a smartphone and unlock it via facial recognition, and then talk to it to ask it to find the nearest bank ATM. However, at the same time, we see that the technology is not quite there yet. We might have to remove our glasses for the facial recognition to work, our smartphones don’t always understand us when we speak, and the location-sensing technology sometimes has trouble finding us.”

Figure 1. Hype Cycle for Emerging Technologies, 2012


Source: Gartner (August 2012)

Although the Hype Cycle presents technologies individually, Gartner encourages enterprises to consider the technologies in sets or groupings, because so many new capabilities and trends involve multiple technologies working together. Often, one or two technologies that are not quite ready can limit the true potential of what is possible. Gartner refers to these technologies as “tipping point technologies” because, once they mature, the scenario can come together from a technology perspective.

Some of the more significant scenarios, along with the tipping point technologies that need to mature before enterprises and governments can deliver new value and experiences to customers and citizens, include:

Any Channel, Any Device, Anywhere — Bring Your Own Everything

The technology industry has long talked about scenarios in which any service or function is available on any device, at any time and anywhere. This scenario is being fueled by the consumerization trend that is making it acceptable for enterprise employees to bring their own personal devices into the work environment. The technologies and trends featured on this Hype Cycle that are part of this scenario include BYOD, hosted virtual desktops, HTML5, the various forms of cloud computing, silicon anode batteries and media tablets. Although all these technologies and trends need to mature for the scenario to become the norm, HTML5, hosted virtual desktops and silicon anode batteries are particularly strong tipping point candidates.

Smarter Things

A world in which things are smart and connected to the Internet has been in the works for more than a decade. Once connected and made smart, things will help people in every facet of their consumer, citizen and employee lives. There are many enabling technologies and trends required to make this scenario a reality. On the 2012 Hype Cycle, Gartner has included autonomous vehicles, mobile robots, Internet of Things, big data, wireless power, complex-event processing, Internet TV, activity streams, machine-to-machine communication services, mesh networks: sensor, home health monitoring and consumer telematics. The technologies and trends that are the tipping points to success include machine-to-machine communication services, mesh networks: sensor, big data, complex-event processing and activity streams.

Big Data and Global Scale Computing at Small Prices

This broad scenario portrays a world in which analytic insight and computing power are nearly infinite and cost-effectively scalable. Once enterprises gain access to these resources, many improved capabilities are possible, such as better understanding customers or better fraud reduction. The enabling technologies and trends on the 2012 Hype Cycle include quantum computing, the various forms of cloud computing, big data, complex-event processing, social analytics, in-memory database management systems, in-memory analytics, text analytics and predictive analytics. The tipping point technologies that will make this scenario accessible to enterprises, governments and consumers include cloud computing, big data and in-memory database management systems.

The Human Way to Interact With Technology

This scenario describes a world in which people interact much more naturally with technology. The technologies on the Hype Cycle that make this possible include human augmentation, volumetric and holographic displays, automatic content recognition, natural-language question answering, speech-to-speech translation, big data, gamification, augmented reality, cloud computing, NFC, gesture control, virtual worlds, biometric authentication methods and speech recognition. Many of these technologies have been “emerging” for multiple years and are starting to become commonplace. However, a few stand out as tipping point technologies, including natural-language question answering and NFC.

What Payment Could Really Become

This scenario envisions a cashless world in which every transaction is an electronic one. This will provide enterprises with efficiency and traceability, and consumers with convenience and security. The technologies on the 2012 Hype Cycle that will enable parts of this scenario include NFC payment, mobile over the air (OTA) payment and biometric authentication methods. Related technologies will also impact the payment landscape, albeit more indirectly. These include the Internet of Things, mobile application stores and automatic content recognition. The tipping point will be surpassed when NFC payment and mobile OTA payment technologies mature.

The Voice of the Customer Is on File

Humans are social by nature, which drives a need to share — often publicly. This creates a future in which the “voice of customers” is stored somewhere in the cloud and can be accessed and analyzed to provide better insight into them. The 2012 Hype Cycle features the following enabling technologies and trends: automatic content recognition, crowdsourcing, big data, social analytics, activity streams, cloud computing, audio mining/speech analytics and text analytics. Gartner believes that the tipping point technologies are privacy backlash and big data.

3D Print It at Home

In this scenario, 3D printing allows consumers to print physical objects, such as toys or housewares, at home, just as they print digital photos today. Combined with 3D scanning, it may be possible to scan certain objects with a smartphone and print a near-duplicate. Analysts predict that 3D printing will take more than five years to mature beyond the niche market.

Additional information is available in “Gartner’s Hype Cycle for Emerging Technologies, 2012” at http://www.gartner.com/hypecycles. The Special Report includes a video in which Ms. Fenn provides more details regarding this year’s Hype Cycles, as well as links to the 92 Hype Cycle reports.


Nanocomputers and quantum computing

Nanocomputer is the logical name for a computer smaller than the microcomputer, which in turn is smaller than the minicomputer. (The minicomputer is called “mini” because it was a lot smaller than the original mainframe computers.) More technically, it is a computer whose fundamental parts are no bigger than a few nanometers. For comparison, the smallest feature of current state-of-the-art microprocessors measures 28 nm as of March 24, 2012. No commercially available computers are named nanocomputers to date, but the term is used in science and science fiction.

There are several ways nanocomputers might be built, using mechanical, electronic, biochemical, or quantum technology. It was argued in 2007 that it is unlikely that nanocomputers will be made out of semiconductor transistors (microelectronic components that are at the core of all modern electronic devices), as they seem to perform significantly less well when shrunk to sizes under 100 nanometers; however, as of 2011 it was projected that 22 nm lithography devices would ship before 2012.

Nanocomputing is based on nanotechnology and is, in fact, one of its major applications. The nanoscale components of such machines make them efficient to use.

Nano-Computing

The history of computer technology has involved a sequence of changes from gears to relays to valves to transistors to integrated circuits and so on. Today’s techniques can fit logic gates and wires a fraction of a micron wide onto a silicon chip. Soon the parts will become smaller and smaller until they are made up of only a handful of atoms. At this point the laws of classical physics break down and the rules of quantum mechanics take over, so the new quantum technology must replace and/or supplement what we presently have. It will support an entirely new kind of computation with new algorithms based on quantum principles.

Presently our digital computers rely on bits which, when charged, represent on, true, or 1; when not charged they are off, false, or 0. A register of 3 bits can represent, at a given moment in time, one of eight numbers (000, 001, 010, …, 111). In the quantum state, an atom (one bit) can be in two places at once according to the laws of quantum physics, so 3 atoms (quantum bits, or qubits) can represent all eight numbers at the same time. So for x qubits, 2^x numbers can be stored. Parallel processing can take place on the 2^x input numbers, performing a task that a classical computer would have to repeat 2^x times or use 2^x processors working in parallel. In other words, a quantum computer offers an enormous gain in the use of computational resources such as time and memory. This becomes mind-boggling when you think of what 32 qubits can accomplish.
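
To make the counting concrete, here is a minimal sketch in plain Python (no quantum libraries; an illustration added here rather than taken from the original text) of the bookkeeping involved: a classical 3-bit register holds exactly one of eight values, while a 3-qubit register is described by 2^3 = 8 amplitudes, one per basis state.

```python
# Minimal sketch: why x qubits are described by 2**x numbers.
# A classical 3-bit register stores exactly one of 8 values; a 3-qubit
# register in equal superposition carries an amplitude for all 8 at once.
import math

x = 3                        # number of (qu)bits
n_states = 2 ** x            # 8 basis states: 000, 001, ..., 111

classical_register = 0b101   # a classical register: one definite value (5)

# A quantum register: a vector of 2**x amplitudes. In an equal
# superposition each basis state has amplitude 1/sqrt(2**x).
amplitudes = [1 / math.sqrt(n_states)] * n_states

for state, amp in enumerate(amplitudes):
    # |amplitude|**2 is the probability of reading out this basis state.
    print(f"|{state:03b}>  amplitude={amp:.3f}  probability={abs(amp) ** 2:.3f}")
```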

This may all sound like just another step of technological progress. Classical computers can do the same computations as quantum computers, only needing more time and more memory. The catch is that they need exponentially more time and memory to match the power of a quantum computer. An exponential increase grows really fast, and available time and memory run out very quickly.

Quantum computers can be programmed in a qualitatively new way using new algorithms. For example, we can construct new algorithms for solving problems, some of which can turn difficult mathematical problems, such as factorization, into easy ones. The difficulty of factorizing large numbers is the basis for the security of many common methods of encryption. RSA, the most popular public-key cryptosystem used to protect electronic bank accounts, gets its security from the difficulty of factoring very large numbers. This was one of the first potential uses for a quantum computer.
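
As a toy illustration of that factoring connection (an added sketch using deliberately tiny primes, not from the original text; real RSA moduli are hundreds of digits long), anyone who can factor the public modulus n can reconstruct the private key:

```python
# Toy RSA keypair from tiny primes -- purely illustrative.
# Security rests entirely on the difficulty of recovering p and q from n,
# which is exactly the problem Shor's algorithm on a quantum computer
# would make easy.
p, q = 61, 53
n = p * q                 # public modulus: 3233
e = 17                    # public exponent
phi = (p - 1) * (q - 1)   # computable only if you know the factors p and q
d = pow(e, -1, phi)       # private exponent (modular inverse, Python 3.8+)

message = 65
ciphertext = pow(message, e, n)    # encrypt with the public key (e, n)
recovered = pow(ciphertext, d, n)  # decrypt with the private key (d, n)
assert recovered == message
```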

“Experimental and theoretical research in quantum computation is accelerating world-wide. New technologies for realising quantum computers are being proposed, and new types of quantum computation with various advantages over classical computation are continually being discovered and analysed and we believe some of them will bear technological fruit. From a fundamental standpoint, however, it does not matter how useful quantum computation turns out to be, nor does it matter whether we build the first quantum computer tomorrow, next year or centuries from now. The quantum theory of computation must in any case be an integral part of the world view of anyone who seeks a fundamental understanding of the quantum theory and the processing of information.” (Center for Quantum Computation)

In 1995, a $100 bet was made to create “the impossible” within 16 years: the world’s first nanometer supercomputer. This resulted in the NanoComputer Dream Team, which utilized the internet to gather talent, amateur and professional, from every scientific field and from all over the world. Their deadline: November 1, 2011.

Quantum computers

The massive amount of processing power generated by computer manufacturers has not yet been able to quench our thirst for speed and computing capacity. In 1947, American computer engineer Howard Aiken said that just six electronic digital computers would satisfy the computing needs of the United States. Others have made similar errant predictions about the amount of computing power that would support our growing technological needs. Of course, Aiken didn’t count on the large amounts of data generated by scientific research, the proliferation of personal computers or the emergence of the Internet, which have only fueled our need for more, more and more computing power.

If, as Moore’s Law states, the number of transistors on a microprocessor continues to double every 18 months, then around 2020 or 2030 the circuits on a microprocessor will be measured on an atomic scale. The logical next step will be to create quantum computers, which will harness the power of atoms and molecules to perform memory and processing tasks. Quantum computers have the potential to perform certain calculations significantly faster than any silicon-based computer.

Scientists have already built basic quantum computers that can perform certain calculations; but a practical quantum computer is still years away. In this article, you’ll learn what a quantum computer is and just what it’ll be used for in the next era of computing.

You don’t have to go back too far to find the origins of quantum computing. While computers have been around for the majority of the 20th century, quantum computing was first theorized less than 30 years ago, by a physicist at the Argonne National Laboratory. Paul Benioff is credited with first applying quantum theory to computers in 1981, when he theorized about creating a quantum Turing machine. Most digital computers, like the one you are reading this on, are based on the Turing theory.

The Turing machine, developed by Alan Turing in the 1930s, is a theoretical device that consists of a tape of unlimited length divided into little squares. Each square can either hold a symbol (1 or 0) or be left blank. A read-write device reads these symbols and blanks, which gives the machine its instructions to perform a certain program. Does this sound familiar? Well, in a quantum Turing machine the difference is that the tape exists in a quantum state, as does the read-write head. This means that the symbols on the tape can be either 0 or 1 or a superposition of 0 and 1; in other words, the symbols are both 0 and 1 (and all points in between) at the same time. While a normal Turing machine can only perform one calculation at a time, a quantum Turing machine can perform many calculations at once.
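
In the standard notation (a textbook formula added here for clarity, not part of the original article), such a symbol is written as a normalised superposition of the two classical values:

```latex
% A qubit: a weighted combination of the classical symbols 0 and 1.
% alpha and beta are complex amplitudes; |alpha|^2 and |beta|^2 give the
% probabilities of reading out 0 or 1 respectively.
\[
  \lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle ,
  \qquad \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1
\]
```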

Today’s computers, like a Turing machine, work by manipulating bits that exist in one of two states: a 0 or a 1. Quantum computers aren’t limited to two states; they encode information as quantum bits, or qubits, which can exist in superposition. Qubits represent atoms, ions, photons or electrons and their respective control devices that are working together to act as computer memory and a processor. Because a quantum computer can contain these multiple states simultaneously, it has the potential to be millions of times more powerful than today’s most powerful supercomputers.

This superposition of qubits is what gives quantum computers their inherent parallelism. According to physicist David Deutsch, this parallelism allows a quantum computer to work on a million computations at once, while your desktop PC works on one. A 30-qubit quantum computer would equal the processing power of a conventional computer that could run at 10 teraflops (trillions of floating-point operations per second). Today’s typical desktop computers run at speeds measured in gigaflops (billions of floating-point operations per second).

Quantum computers also utilize another aspect of quantum mechanics known as entanglement. One problem with the idea of quantum computers is that if you try to look at the subatomic particles, you could bump them, and thereby change their value. If you look at a qubit in superposition to determine its value, the qubit will assume the value of either 0 or 1, but not both (effectively turning your spiffy quantum computer into a mundane digital computer). To make a practical quantum computer, scientists have to devise ways of making measurements indirectly to preserve the system’s integrity. Entanglement provides a potential answer. In quantum physics, if you apply an outside force to two atoms, it can cause them to become entangled, and the second atom can take on the properties of the first atom. So if left alone, an atom will spin in all directions. The instant it is disturbed it chooses one spin, or one value; and at the same time, the second entangled atom will choose an opposite spin, or value. This allows scientists to know the value of the qubits without actually looking at them.
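
One standard way to write down such an “opposite spins” pair (again a textbook formula added for illustration) is the entangled singlet state, in which neither qubit has a definite value on its own, yet measuring one fixes the other to the opposite outcome:

```latex
% The two-qubit singlet state: a measurement of 0 on one qubit forces 1
% on the other, and vice versa, matching the "opposite spin" behaviour
% described above.
\[
  \lvert \Psi^{-} \rangle
    = \frac{1}{\sqrt{2}} \bigl( \lvert 01 \rangle - \lvert 10 \rangle \bigr)
\]
```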

Some recent advancements in the field of quantum computing

So-called quantum computers are designed to quickly crunch numbers that would take a person a lifetime or longer—for instance, mapping trillions of amino acids for futuristic drug cures or making sense of the avalanche of public data we create daily. So what can you get by putting one to use for your company, as Lockheed Martin (LMT) has since it bought the world’s first corporate model from D-Wave Systems in 2011? (A few weeks ago, Google (GOOG) bought the second.) The aerospace and security giant has been operating its device at the University of Southern California’s Quantum Computation Center for the past 18 months.

Quantum computing uses the quantum nature of matter, the atoms themselves, as computing devices. Normal computer architecture is based on the bit—represented either as a one or a zero. The quantum computer is programmed so that the input is initially both zero and one.

On the quantum level, you’re able to program the atoms to represent all possible input combinations, and to do so simultaneously. That means when you run an algorithm, all possible input combinations are tested at once. With a regular computer you’d have to cycle serially through every possible input combination to arrive at your solution, meaning the most complicated calculations would take longer than the age of the universe to complete.
The goal is to solve hugely complex problems in a reasonable amount of time. That has intrigued scientists at Lockheed Martin since the inception of quantum computing.
Lockheed Martin started with smaller tasks at first, to better understand the capabilities of the machine. One area of interest is complex systems such as software verification and validation. The development of any large computer system integration initiative involves a lot of software, and validating the performance of that software is vital, but it’s a very time-consuming and often very expensive undertaking. Lockheed Martin has taken the software and cast it as a problem for the quantum computer to address: it scans through the switches and combinations in the software code and makes sure that the software is performing in the way it is expected to.
Lockheed Martin sees the quantum computer as a machine that frees up time and money. If you can validate and verify software in a single series of tests, then the money and time saved can be used elsewhere; it becomes an innovation enabler. In that sense, the company is already in the era of quantum computing. The academic question of how quantum the machine is, and how entangled its “qubits” (quantum bits) are, doesn’t really concern them. What they are concerned with is how it can help reduce costs, make better systems, and accelerate innovation. Quantum computing is a practical tool for extremely complex predictive analysis and machine learning, where you need to assess many variables and many patterns and test models against them. This is relevant in drug discovery, cybersecurity, business, finance, investment, health care, logistics, and planning. There are a number of business applications, namely those that involve solving complex optimization problems, that today would be too difficult to address with silicon computing.
Conventional computing is not going to go away. You wouldn’t want to use a quantum computer to balance your checkbook. Quantum computing best addresses those exceedingly complex computational problems—in drug discovery, for example, when you have trillions of combinations of amino acids to cycle through to find that single protein. That’s a job for quantum computing. That’s the power of it, in a nutshell.

Qubit Control

Computer scientists control the microscopic particles that act as qubits in quantum computers by using control devices.

  • Ion traps use optical or magnetic fields (or a combination of both) to trap ions.
  • Optical traps use light waves to trap and control particles.
  • Quantum dots are made of semiconductor material and are used to contain and manipulate electrons.
  • Semiconductor impurities contain electrons by using “unwanted” atoms found in semiconductor material.
  • Superconducting circuits allow electrons to flow with almost no resistance at very low temperatures.
Today’s Quantum Computers

Quantum computers could one day replace silicon chips, just like the transistor once replaced the vacuum tube. But for now, the technology required to develop such a quantum computer is beyond our reach. Most research in quantum computing is still very theoretical.

The most advanced quantum computers have not gone beyond manipulating more than 16 qubits, meaning that they are a far cry from practical application. However, the potential remains that quantum computers one day could perform, quickly and easily, calculations that are incredibly time-consuming on conventional computers. Several key advancements have been made in quantum computing in the last few years. Let’s look at a few of the quantum computers that have been developed.

1998

Los Alamos and MIT researchers managed to spread a single qubit across three nuclear spins in each molecule of a liquid solution of alanine (an amino acid used to analyze quantum state decay) or trichloroethylene (a chlorinated hydrocarbon used for quantum error correction). Spreading out the qubit made it harder to corrupt, allowing researchers to use entanglement to study interactions between states as an indirect method for analyzing the quantum information.

2000

In March, scientists at Los Alamos National Laboratory announced the development of a 7-qubit quantum computer within a single drop of liquid. The quantum computer uses nuclear magnetic resonance (NMR) to manipulate particles in the atomic nuclei of molecules of trans-crotonic acid, a simple fluid consisting of molecules made up of six hydrogen and four carbon atoms. The NMR is used to apply electromagnetic pulses, which force the particles to line up. These particles in positions parallel or counter to the magnetic field allow the quantum computer to mimic the information-encoding of bits in digital computers.

In August, researchers at the IBM Almaden Research Center developed what they claimed was the most advanced quantum computer to date. The 5-qubit quantum computer was designed to allow the nuclei of five fluorine atoms to interact with each other as qubits, be programmed by radio frequency pulses and be detected by NMR instruments similar to those used in hospitals. Led by Dr. Isaac Chuang, the IBM team was able to solve in one step a mathematical problem that would take conventional computers repeated cycles. The problem, called order-finding, involves finding the period of a particular function, a typical aspect of many mathematical problems involved in cryptography.

2001

Scientists from IBM and Stanford University successfully demonstrated Shor’s Algorithm on a quantum computer. Shor’s Algorithm is a method for finding the prime factors of numbers, which plays an intrinsic role in cryptography. They used a 7-qubit computer to find the factors of 15. The computer correctly deduced that the prime factors were 3 and 5.
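
For readers curious what “order-finding” buys you, here is a classical sketch (an added illustration; the brute-force period search below is the step a quantum computer performs exponentially faster) of how the period of a^r mod 15 yields the factors 3 and 5:

```python
# Classical sketch of the number theory behind Shor's algorithm for N = 15.
# Only the period-finding loop is the quantum computer's job; the rest
# is ordinary arithmetic.
from math import gcd

N, a = 15, 7              # a is chosen to share no factor with N

# Find the order r: the smallest r > 0 with a**r % N == 1.
# (This brute-force search is what Shor's algorithm speeds up.)
r = 1
while pow(a, r, N) != 1:
    r += 1

assert r % 2 == 0         # for a = 7, r = 4, which is even as required
half = pow(a, r // 2, N)  # a**(r/2) mod N
factors = gcd(half - 1, N), gcd(half + 1, N)
print(f"order r = {r}, factors of {N}: {factors}")   # -> (3, 5)
```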

2005

The Institute of Quantum Optics and Quantum Information at the University of Innsbruck announced that scientists had created the first qubyte, or series of 8 qubits, using ion traps.

2006

Scientists in Waterloo and Massachusetts devised methods for quantum control on a 12-qubit system. Quantum control becomes more complex as systems employ more qubits.

2007

Canadian startup company D-Wave demonstrated a 16-qubit quantum computer. The computer solved a sudoku puzzle and other pattern-matching problems. The company claimed it would produce practical systems by 2008. Skeptics believe practical quantum computers are still decades away, that the system D-Wave has created isn’t scalable, and that many of the claims on D-Wave’s Web site are simply impossible (or at least impossible to know for certain given our understanding of quantum mechanics).

If functional quantum computers can be built, they will be valuable in factoring large numbers, and therefore extremely useful for decoding and encoding secret information. If one were to be built today, no information on the Internet would be safe. Our current methods of encryption are simple compared to the complicated methods possible in quantum computers. Quantum computers could also be used to search large databases in a fraction of the time that it would take a conventional computer. Other applications could include using quantum computers to study quantum mechanics, or even to design other quantum computers.

But quantum computing is still in its early stages of development, and many computer scientists believe the technology needed to create a practical quantum computer is years away. Quantum computers must have at least several dozen qubits to be able to solve real-world problems, and thus serve as a viable computing method.

India Inc – Generation Y Characteristics (Twenty-plus)

Gen Y, called the Golden Generation – Traits

a) Cannot have enough freedom at work.

b) Constantly seek time and space from seniors; they mine information from many sources.

c) Vociferous in displaying their IQ and abilities. Their qualities include being impatient, experiment-driven, inquisitive, enthusiastic, technology-savvy, self-aware and individualistic. They should be kept occupied with career challenges; we need to treat them as equals, since command and control does not work for them, to have transparent conversations, and to be consistent with work expectations. Managers must stay one up on knowledge to earn their respect. Gen Y are constantly re-skilling and re-tooling themselves on the job, and they identify hobbies to delve into (e.g. writing, music, reading), find their groove in those areas and develop them.

d) Want quick rewards, provide quick solutions and demand quick solutions.

e) Set lofty and often unrealistic goals.

f) Are high on energy, with strong self-belief and faith in themselves.

g) Want work-life balance from day one; they favour sabbaticals of two months in every 12.

h) See career growth as critical; money is a given.

i) Want to control their lives even at work; they make no apologies and sometimes brag about their abilities. Their engagement, though, is assured.

Future directions, Risks and Challenges – Cloud computing

Cloud computing can be defined as a pool of virtualised computing resources that allows users to gain access to applications and data in a web-based environment on demand. This post explains the various cloud architecture and usage models that exist and some of the benefits of using cloud services. It seeks to contribute to a better understanding of the emerging threat landscape created by cloud computing, with a view to identifying avenues for risk reduction. Three avenues for action are identified: the need for a culture of cyber security to be created through the development of effective public-private partnerships; the need for the privacy regime to be reformed to deal with the issues created by cloud computing; and the need for cyber-security researchers to find ways to mitigate existing and new security risks in the cloud computing environment.

Cloud computing is now firmly established in the information technology landscape and its security risks need to be mapped and addressed at this critical stage in its development.

In the ‘traditional’ computing environment, a computer’s operating system, applications and data are typically installed and stored on the local machine. In a cloud computing environment, individuals and businesses work with applications and data stored and/or maintained on shared machines in a web-based environment, rather than on machines physically located in a user’s home or a corporate environment. Lew Tucker, Vice President and Chief Technology Officer of Cloud Computing at Sun Microsystems, explained that cloud computing is ‘the movement of application services onto the Internet and the increased use of the Internet to access a wide variety of services traditionally originating from within a company’s data center’ (Creeger 2009: 52). For example, web-based applications such as Google’s Gmail™ can be accessed in real time from an Internet-connected machine anywhere in the world.

Use of cloud services creates a growing interdependence among both public and private sector entities and the individuals served by these entities. This post provides a snapshot of risk areas specific to cloud services and those that apply more generally in an online environment which clients of cloud service providers should be aware of.

Cloud computing

It is not clear when the term cloud computing was first coined. For example, Bartholomew (2009), Bogatin (2006) and several others suggested that ‘cloud computing’ terminology was, perhaps, first coined by Google™ Chief Executive Eric Schmidt in 2006. Kaufman (2009: 61) suggests that cloud computing terminology ‘originates from the telecommunications world of the 1990s, when providers began using virtual private network (VPN) services for data communication’. Desisto, Plummer and Smith (2008: 1) state that ‘[t]he first SaaS [Software as a Service] offerings were delivered in the late 1990s…[a]lthough these offerings weren’t called cloud computing’. There is, however, agreement on the definition of cloud computing.

The National Institute of Standards and Technology defines cloud computing as

a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (eg networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction (Mell 2009: 9).

Architectures and deployment models

Cloud architectures can be broadly categorised into:

Infrastructure as a Service (IaaS) is the foundation of cloud services. It provides clients with access to server hardware, storage, bandwidth and other fundamental computing resources. For example, Amazon EC2 allows individuals and businesses to rent machines preconfigured with selected operating systems on which to run their own applications.

Platform as a Service (PaaS) builds upon IaaS and provides clients with access to the basic operating software and optional services to develop and use software applications (eg database access and payment service) without the need to buy and manage the underlying computing infrastructure. For example, Google App Engine allows clients to run their web applications (ie software that can be accessed using a web browser such as Internet Explorer over the internet) on Google’s infrastructure.

Software as a Service (SaaS) builds upon the underlying IaaS and PaaS and provides clients with integrated access to software applications. For example, the Oracle SaaS Platform allows independent software vendors to build, deploy and manage SaaS and cloud-based applications using a licensing economic model; here, users purchase a license and support for components of the Oracle SaaS Platform on a monthly basis.
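
As a concrete taste of the IaaS model described above, here is a minimal sketch of renting and releasing a single EC2 machine programmatically using the boto3 Python SDK (the AMI ID, key pair name and region are placeholders, and the SDK itself postdates this post; treat it as one way to do this today, not part of the original text):

```python
# Minimal sketch: renting IaaS capacity (Amazon EC2) via boto3.
# The AMI ID, key-pair name and region below are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Launch one small on-demand instance; you pay only while it runs.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder machine image
    InstanceType="t2.micro",
    KeyName="my-key-pair",             # placeholder SSH key pair
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print("Launched", instance_id)

# When the workload finishes, release the capacity to stop paying.
ec2.terminate_instances(InstanceIds=[instance_id])
```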

Cloud services can be used in a private, public, community/managed or hybrid setting (Cloud Security Alliance 2009). Privately-hosted cloud services are generally considered a safer but more costly option than services using a shared-tenancy setting (ie data from different clients stored on a single physical machine). In line with this, the US Government recently announced an initiative ‘to offer cloud-based services that are hosted in private data centers and which could be used to handle more sensitive data’ (McMillan 2009: np).

In a community/managed setting, tenancy can either be single (dedicated) or shared and the IT infrastructure is either managed by the organisation or a third-party cloud service provider. The main difference between hybrid cloud services and other cloud services is that the former ‘is a composition of two or more clouds (private, community, or public) that remain unique entities but are bound together by standardized or proprietary technology that enables data and application portability’ (Mell & Grance 2009: 13).

Benefits

Cloud computing provides a scalable online environment that can handle an increased volume of work without degrading system performance. Cloud computing also offers significant computing capability and economies of scale that might not otherwise be affordable to businesses, especially small and medium enterprises (SMEs) that may not have the financial and human resources to invest in IT infrastructure. Advantages include:

  • Capital costs—SMEs can provide unique services using large-scale resources from cloud service providers and ‘add or remove capacity from their IT infrastructure to meet peak or fluctuating service demands while paying only for the actual capacity used’ (Sotomayor et al. 2009: 14) on a ‘pay-as-you-go’ economic model.
  • Running costs—it can also be significantly cheaper to rent added server space for a few hours at a time than to maintain proprietary servers; rental prices for Amazon Elastic Compute Cloud (EC2), for example, are between US$0.10–1.00 an hour (a rough cost comparison is sketched below). Businesses do not have to worry about upgrading their resources whenever a new version of an application is available. They can also base their services in the data centres of bigger enterprises or host their IT infrastructure in locations offering the lowest cost.
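
As a rough illustration of the pay-as-you-go arithmetic, the sketch below compares renting burst capacity at an assumed rate within the cited US$0.10–1.00 range against owning equivalent hardware; every other figure is an assumption chosen purely for the example.

    # Illustrative only: renting peak capacity versus owning it outright.
    hourly_rate = 0.40      # assumed rate within the cited US$0.10-1.00 range
    instances = 20          # extra servers needed during each demand spike
    hours_per_spike = 6     # duration of each spike
    spikes_per_year = 12    # eg one month-end batch run per month

    rental = hourly_rate * instances * hours_per_spike * spikes_per_year
    print(f"Annual rental cost for peak capacity: US${rental:,.2f}")

    # Owning the same capacity (assumed US$3,000 per server, 3-year life),
    # paid for whether or not the servers are busy.
    owned = instances * 3000 / 3
    print(f"Annual cost of owning equivalent hardware: US${owned:,.2f}")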

Advantages of using cloud services can also go beyond cost savings as cloud computing allows clients to:

  • Avoid the expense and time-consuming tasks of installing and maintaining hardware infrastructure and software applications; and
  • Provision and deliver services to their own customers rapidly by optimising their IT infrastructure (Lewin 2009).

External hosting of applications and storage can also provide redundancy and business continuity in the event of a site failure.

Service level agreements

To obtain guarantees of service delivery, businesses using cloud computing services typically enter into service level agreements (SLAs) with their cloud service providers. Although SLAs vary between businesses and providers, they typically specify the agreed service level through quality-of-service parameters, the level of service availability, the security measures adopted by the provider and the rates for the services.
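
To illustrate the kinds of parameters such an agreement captures, here is a toy sketch in Python. All field names and values are assumptions invented for the example, not terms from any real provider's SLA.

    # A toy representation of typical SLA parameters (illustrative only).
    sla = {
        "availability": 0.999,                 # agreed service availability
        "max_response_time_ms": 200,           # quality-of-service parameter
        "security_measures": ["encryption at rest", "annual security audit"],
        "rate_usd_per_hour": 0.40,             # agreed rates for the services
        "service_credit_pct": 10,              # remedy if availability is missed
    }

    def monthly_downtime_budget_minutes(availability):
        """Downtime allowance implied by the agreed availability level."""
        return (1 - availability) * 30 * 24 * 60

    print(f"{monthly_downtime_budget_minutes(sla['availability']):.0f} minutes/month")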

Cloud computing risks

Attacks targeting shared-tenancy environment

A virtual machine (VM) is a software implementation of a computer that runs its own operating system and applications as if it were a physical machine (VMWare 2009). Multiple VMs can concurrently run different software applications on different operating system environments on a single physical machine, reducing hardware costs and space requirements.

In a shared-tenancy cloud computing environment, data from different clients can be hosted on separate VMs but reside on a single physical machine. This provides maximum flexibility. Software applications running in one VM should not be able to impact or influence software running in another VM. An individual VM should be unaware of the other VMs running in the environment as all actions are confined to its own address space.

In a recent study, a team of computer scientists from the University of California, San Diego and Massachusetts Institute of Technology examined the widely-used Amazon EC2 services. They found that ‘it is possible to map the internal cloud infrastructure, identify where a particular target VM is likely to reside, and then instantiate new VMs until one is placed co-resident with the target’ (Ristenpart et al. 2009: 199). This demonstrated that the research team were able to load their eavesdropping software onto the same servers hosting targeted websites (Hardesty 2009). By identifying the target VMs, attackers can potentially monitor the cache (a small allotment of high-speed memory used to store frequently-used information) in order to steal data hosted on the same physical machine (Hardesty 2009). Such an attack is also known as a side-channel attack.

The findings from this research may only be a proof-of-concept at this stage, but they raise concerns about the possibility of cloud computing servers becoming a central point of vulnerability that can be criminally exploited. The Cloud Security Alliance, for example, listed this as one of the top threats to cloud computing:

Attacks have surfaced in recent years that target the shared technology inside Cloud Computing environments. Disk partitions, CPU caches, GPUs, and other shared elements were never designed for strong compartmentalization. As a result, attackers focus on how to impact the operations of other cloud customers, and how to gain unauthorized access to data. (Cloud Security Alliance 2010: 11)

VM-based malware

Vulnerabilities in VMs can be exploited by malicious code (malware) such as VM-based rootkits designed to infect both client and server machines in cloud services. Rootkits are cloaking technologies usually employed by other malware programs to abuse compromised systems by hiding files, registry keys and other operating system objects from diagnostic, antivirus and security programs. For example, in April 2009, a security researcher pointed out how a critical vulnerability in VMware’s VM display function could be exploited to run malware, allowing an attacker ‘to read and write memory on the “host” operating system [OS]’ (Keizer 2009: np).

VM-based rootkits, as pointed out by Price (2008: 27), could be used by attackers to ‘gain complete control of the underlying OS without the compromised OS being aware of their existence…[and] are especially dangerous because they also control all hardware interfaces. Once the VM-based rootkits are installed on the machine, they can “view keystrokes, network packets, disk state, and memory state, while the compromised OS remains oblivious”’.

Botnet hosting

Bot malware typically takes advantage of system vulnerabilities, software bugs or hacker-installed backdoors to load malicious code onto machines without the owners’ consent or knowledge, often for nefarious purposes. Machines infected with bot malware are turned into ‘zombies’ that can be used as remote attack tools or to form part of a botnet under the control of the botnet controller.

Zombies are compromised machines waiting to be activated by their command and control (C&C) servers. The C&C servers are often machines that have been compromised and arranged in a distributed structure to limit traceability.

Cybercriminals could potentially abuse cloud services to operate C&C servers for distributed denial-of-service (DDoS) attacks and other cybercriminal activities. In a DDoS attack, multiple sources target a specific website, flooding its web server with repeated messages, tying up the system and denying access to legitimate users. In December 2009, for example, a ‘new wave of a Zeus bot (Zbot) variant was spotted taking advantage of Amazon EC2’s cloud-based services for its C&C…functionalities’ (Ferrer 2009: np).

Launch pad for brute force and other attacks

There have also been suggestions that the virtualised infrastructure can be used as a launching pad for new attacks. A security consultant recently suggested that it may be possible to abuse cloud computing services to launch a brute force attack (a strategy used to break encrypted data by trying all possible decryption key or password combinations) on various types of passwords. Using Amazon EC2 as an example, the consultant estimated that based on the ‘hourly fees Amazon charges for its EC2 web service, it would cost more than [US]$1.5m to brute force a 12-character password containing nothing more than lower-case letters a through z…[but] an 11-character code costs less than [US]$60,000 to crack, and a 10-letter phrase costs less than [US]$2,300’ (Goodin 2009: np).
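
The arithmetic behind those estimates is easy to reproduce: each extra lower-case character multiplies the keyspace, and hence the worst-case cost, by 26. In the sketch below the cost per guess is an assumption back-calculated from the quoted 10-character figure; the 26-fold scaling, not the exact rate, is the point.

    # Keyspace scaling behind the quoted estimates (Goodin 2009).
    ALPHABET = 26                      # lower-case letters a through z only
    cost_for_10_chars = 2300.0         # quoted ~US$2,300 for 10 characters
    cost_per_guess = cost_for_10_chars / (ALPHABET ** 10)  # assumed constant

    for length in (10, 11, 12):
        keyspace = ALPHABET ** length
        print(f"{length} chars: {keyspace:.2e} combinations, "
              f"~US${keyspace * cost_per_guess:,.0f}")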

Although it is still relatively expensive to perform brute force online password-guessing attacks (also known as online dictionary attacks), this could have broad implications for systems using password-based authentication. It may not take long for attackers to design a more practical and cheaper mechanism that exploits cloud services as a launch pad for other attacks, a threat also identified by the Cloud Security Alliance (2010: 8):

Future areas of concern include password and key cracking, DDOS, launching dynamic attack points, hosting malicious data, botnet command and control, building rainbow tables, and CAPTCHA solving farms.

Data availability (business continuity)

A major risk to business continuity in the cloud computing environment is loss of internet connectivity (which could occur in a range of circumstances, such as natural disasters), as businesses depend on internet access to reach their corporate information. In addition, if a vulnerability is identified in a particular service provided by the cloud service provider, the business may have to suspend all use of that provider until it can be assured that the vulnerability has been rectified.

There are also concerns that the seizure of a data-hosting server by law enforcement agencies may result in the unnecessary interruption or cessation of unrelated services whose data is stored on the same physical machine.

In a recent example, ‘FBI agents [reportedly] seized computers from a data center at 2323 Bryan Street in Dallas, Texas, attempting to gather evidence in an ongoing investigation of two men and their various companies accused of defrauding AT&T and Verizon for more than US$6 million’ (Lemos 2009: np). This resulted in the unintended consequence of disrupting the continuity of businesses whose data and information are hosted on the seized hardware.

[For] LiquidMotors, a company that provides inventory management to car dealers, the servers held its client data and hosted its managed inventory services. The FBI seizure of the servers in the data center rack effectively shut down the company, which filed a lawsuit against the FBI the same day to get the data back (Lemos 2009: np).

While the above example may be an isolated case, it raised concerns about unauthorised access to seized data not related to the warrant, which can result in the unintended disclosure of data to unwanted parties, particularly in authoritarian countries.

There have been a number of reported incidents of cloud services being taken offline by DDoS attacks (see Metz 2009). Although DDoS attacks are nothing new, the cloud computing environment presents a new attack vector, and one that may have a more widespread impact on internet users.

The security measures adopted by different cloud service providers vary. If ‘a cybercriminal can identify the [cloud service] provider whose vulnerabilities are the easiest to exploit, then this entity becomes a highly visible target. The lack of security associated with this single entity threatens the entire cloud in which it resides’ (Kaufman 2009: 63).

Rogue clouds

Just like entrepreneurs, cybercriminals and organised crime groups are always on the lookout for new markets and with the rise of cloud computing, a new sector for exploitation now exists. Rogue cloud service providers based in jurisdictions with lax cybercrime legislation can provide confidential hosting and data storage services for a usually steep fee. Such services could potentially be abused by organised crime groups to store and distribute criminal data (eg child abuse materials for commercial purposes) to avoid the scrutiny of law enforcement agencies.

Hosting confidential business data with cloud service providers involves the transfer of a considerable amount of management control to cloud service providers that usually results in diminished control over security arrangements. There is the risk of rogue providers mining the data for secondary uses such as marketing and reselling the mined data to other businesses. A June 2009 email survey of 220 decision-makers in US organisations with more than 1,000 employees highlighted similar concerns. In the survey, 40.5 percent of the respondents agreed/strongly agreed that ‘[t]he trend toward using SaaS and cloud computing solutions in the enterprise seriously increases the risk of data leakage’ (Proofpoint 2009: 24).

Unfortunately, clients (especially SMEs) are often less aware of the risks and may not have an easy way of determining whether a particular cloud service provider is trustworthy. Tim Watson, head of the computer forensics and security group at De Montfort University, remarked that ‘one provider may offer a wonderfully secure service and another may not, if the latter charges half the price, the majority of organisations will opt for it as they have no real way of telling the difference’ (Everett 2009: 7).

Other potential risks

Espionage risks

There is increasing pressure for nation-states to develop cyber-offensive capabilities. The next wave of cyber-security threats could potentially be targeted attacks aimed at specific government agencies and organisations, or individuals within enterprises including cloud service providers. For example, Google and several Gmail accounts belonging to Chinese and Tibetan activists have reportedly been targeted (Google 2010; Helft & Markoff 2010).

Foreign intelligence services and industrial spies may not disrupt the normal functioning of an information system as they are mainly interested in obtaining information relevant to vital national or corporate interests. They do so through clandestine entry into computer systems and networks as part of their information-gathering activities.

Cloud service providers may be compelled to scan or search data of interest to ‘national security’ and to report on, or monitor, particular types of transactional data as these data may be subject to the laws of the jurisdiction in which the physical machine is located (Gellman 2009). In addition, overseas cloud service providers may not be legally obliged to notify the clients (owners of the data) about such requests.

Regulation and governance

The privacy and confidentiality risks faced by businesses that use cloud services also depend to a large extent on the terms of service and privacy policy established by the cloud service providers. Failure to comply with data protection legislation may lead to administrative, civil and criminal sanctions. Data confidentiality and privacy ‘risks may be magnified when the cloud provider has reserved the right to change its terms and policies at will’ (Gellman 2009: 6).

Some cloud service providers argue that such jurisdictional issues may be capable of resolution contractually, via SLAs and the like. Clients using cloud services could include clauses in their SLAs that specify the law governing the SLA and the competent court for disputes arising from the interpretation and execution of the contract. The Cloud Security Alliance (2009: 28) also suggested that clients of cloud services should require their providers ‘to deliver a comprehensive list of the regulations and statutes that govern the site and associated services and how compliance with these items is executed’.

Businesses should ensure that SLAs and other legally-binding contractual arrangements with cloud service providers comply with applicable regulatory obligations (eg privacy laws) and industry standards, as they may be liable for breaching these regulations even when the data being breached is held or processed by the cloud service provider.

Determining the law of the jurisdiction in which the SLA is held is an important issue. It may not, however, be as simple as examining the contractual laws that govern operations of cloud service providers to determine which jurisdiction’s laws apply in any particular case. Gellman (2009: 19) pointed out that ‘[t]he user may be unaware of the existence of a second-degree provider or the actual location of the user’s data…[and] it may be impossible for a casual user to know in advance or with certainty which jurisdiction’s law actually applies to information entrusted to a cloud provider’.

Businesses should continue to conduct due diligence on cloud service providers, maintain a comprehensive compliance framework and ensure that protocols are in place to continuously monitor and manage cloud service providers, offshore vendors and their associated outsourcing relationships. This would ensure businesses have a detailed understanding of where and how their data are stored, maintain some degree of oversight and ensure that an acceptable authentication and access mechanism is in place to meet their privacy and confidentiality needs.

The way forward: a culture of security

Vulnerabilities in a particular cloud service or cloud computing environment can potentially be exploited by criminals and actors with malicious intent. However, no single public or private sector entity ‘owns’ the issue of cyber security. There is, arguably, a need to take a broader view and promote transparency and confidence building between cloud service providers, businesses and government agencies using cloud services as well as between government and law enforcement agencies.

In addition, an effective cyber-security policy should be comprehensive and encompass all (public and private sector) entities. The public and private sectors should continue to work together to:

  • Identify and prioritise current and emerging risk areas;
  • Develop and validate effective measures and mitigation controls. This would involve establishing a standard that mandates certain minimum requirements to ensure an adequate level of electronic information exchange security; and
  • Ensure that these strategies are implemented and updated at the respective level.

It is reasonable to assume that higher levels of security can only be achieved at higher marginal costs. To encourage a culture of security, governments could incubate and create market incentives for cloud service providers to integrate security into the software and hardware and system development life cycle. An improved level and type of security is likely to increase the marginal cost of security violations, which in turn will reduce the marginal benefits of cybercrime.

An example is to create an environment in which cloud service providers can gain marketing and competitive advantages by offering products and services with higher levels, and more innovative types, of security to assist in combating cyber exploitation. This could be accomplished through government tenders. Dealing with insider threats should also be incorporated into the software/hardware and system development life cycle.

Cloud computing security: insider threats

Cloud computing security is ripe with new opportunities for future research, including cloud-related insider threats. The nature of the insider is unlikely to change as a result of cloud computing, but the opportunities for attack will broaden. Researchers should take note of these new opportunities and respond accordingly to prevent, detect and respond to new cloud-related insider attacks. Some important future research topics are:
  • Socio-technical approach to insider threats
  • Predictive models
  • Identifying cloud-based indicators
  • Virtualization and hypervisors
  • Awareness and reporting
  • Normal user behavior analysis
  • Policy integration

Geoinformatics: fencing, positioning, tagging and caching

Geoinformatics, also known as geographic information science (GIS) or geographic information technology (GIT), is the science and technology that develops and uses information science infrastructure to address the problems of geography, the geosciences and related branches of engineering.

http://www.opengeospatial.org/

http://www.icaci.org/

http://www.isprs.org/

http://www.iugg.org/

Overview

Geoinformatics has been described as “the science and technology dealing with the structure and character of spatial information, its capture, its classification and qualification, its storage, processing, portrayal and dissemination, including the infrastructure necessary to secure optimal use of this information” or as “the art, science or technology dealing with the acquisition, storage, processing, production, presentation and dissemination of geoinformation”.

Geomatics is a similarly used term that encompasses geoinformatics, but geomatics focuses more on surveying. Geoinformatics has at its core the technologies supporting the processes of acquiring, analyzing and visualizing spatial data. Both geomatics and geoinformatics include, and rely heavily upon, the theory and practical implications of geodesy.

Geography and earth science increasingly rely on digital spatial data acquired from remotely sensed images, analyzed by geographical information systems (GIS) and visualized on paper or the computer screen.

Geoinformatics combines geospatial analysis and modeling, development of geospatial databases, information systems design, human-computer interaction and both wired and wireless networking technologies. Geoinformatics uses geocomputation and geovisualization for analyzing geoinformation.

Branches of geoinformatics include:

  • Cartography
  • Geodesy
  • Geographic Information Systems
  • Global Navigation Satellite Systems
  • Photogrammetry
  • Remote sensing
  • Web mapping

Applications

Many fields benefit from geoinformatics, including urban planning and land use management, in-car navigation systems, virtual globes, public health, local and national gazetteer management, environmental modeling and analysis, the military, transport network planning and management, agriculture, meteorology and climate change, oceanography and coupled ocean and atmosphere modelling, business location planning, architecture and archeological reconstruction, telecommunications, criminology and crime simulation, and aviation and maritime transport. The importance of the spatial dimension in assessing, monitoring and modelling the issues and problems related to sustainable management of natural resources is recognized all over the world. Geoinformatics has become a very important technology for decision-makers across a wide range of disciplines, industries and commercial sectors: environmental agencies, local and national government, research and academia, national survey and mapping organisations, international organisations, the United Nations, emergency services, public health and epidemiology, crime mapping, transportation and infrastructure, the information technology industries, GIS consulting firms, environmental management agencies, the tourist industry, utility companies, market analysis and e-commerce, and mineral exploration, among others. Many government and non-government agencies have started to use spatial data for managing their day-to-day activities.

A geo-fence is a virtual perimeter for a real-world geographic area.

A geo-fence could be dynamically generated—as in a radius around a store or point location. Or a geo-fence can be a predefined set of boundaries, like school attendance zones or neighborhood boundaries. Custom-digitized geofences are also in use.

When the location-aware device of a location-based service (LBS) user enters or exits a geo-fence, the device receives a generated notification. This notification might contain information about the location of the device. The geofence notice might be sent to a mobile telephone or an email account.
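
For a radius-style geo-fence like the store example above, the core check is a great-circle distance comparison. The sketch below is a minimal Python illustration; the coordinates and radius are placeholder assumptions, and a production location-based service would add position smoothing and debouncing.

    # Minimal radius geo-fence check using the haversine formula.
    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance between two points, in kilometres."""
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = (sin((lat2 - lat1) / 2) ** 2
             + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
        return 2 * 6371 * asin(sqrt(a))  # 6371 km = mean Earth radius

    FENCE_CENTRE = (40.7484, -73.9857)   # placeholder store location
    FENCE_RADIUS_KM = 0.5                # placeholder fence radius

    def update(lat, lon, was_inside):
        """Generate a notification when the device crosses the boundary."""
        inside = haversine_km(lat, lon, *FENCE_CENTRE) <= FENCE_RADIUS_KM
        if inside != was_inside:
            print("Entered geo-fence" if inside else "Exited geo-fence")
        return inside

    state = update(40.7500, -73.9900, False)  # prints "Entered geo-fence"
    state = update(40.7700, -73.9500, state)  # prints "Exited geo-fence"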

Geofencing, used with child location services, can notify parents when a child leaves a designated area.

Geofencing is a critical element of telematics hardware and software. It allows users of the system to draw zones around places of work, customer sites and secure areas. When these geo-fences are crossed by an equipped vehicle or person, a warning can be triggered to the user or operator via SMS or email.

Other applications include sending an alert if a vehicle is stolen and notifying rangers when wildlife stray into farmland.

Geofencing in a security strategy model provides security to wireless local area networks. This is done by using predefined borders, e.g., an office space with borders established by positioning technology attached to a specially programmed server. The office space becomes an authorized location for designated users and wireless mobile devices.

The Global Positioning System (GPS) is a space-based satellite navigation system that provides location and time information in all weather, anywhere on or near the Earth where there is an unobstructed line of sight to four or more GPS satellites (four are needed because a receiver must solve for three position coordinates plus its own clock offset). It is maintained by the United States government and is freely accessible to anyone with a GPS receiver.

The GPS program provides critical capabilities to military, civil and commercial users around the world. In addition, GPS is the backbone for modernizing the global air traffic system.

The GPS project was developed in 1973 to overcome the limitations of previous navigation systems, integrating ideas from several predecessors, including a number of classified engineering design studies from the 1960s. GPS was created and realized by the U.S. Department of Defense (DoD) and was originally run with 24 satellites. It became fully operational in 1994.

Advances in technology and new demands on the existing system have now led to efforts to modernize the GPS system and implement the next generation of GPS III satellites and the Next Generation Operational Control System (OCX). Announcements from the Vice President and the White House in 1998 initiated these changes, and in 2000 the U.S. Congress authorized the modernization effort, referred to as GPS III.

In addition to GPS, other systems are in use or under development. The Russian GLObal NAvigation Satellite System (GLONASS) was used only by the Russian military until it was made fully available to civilians in 2007. There are also the planned European Union Galileo positioning system, the Chinese Compass navigation system and the Indian Regional Navigational Satellite System.

Geotagging (also written as GeoTagging) is the process of adding geographical identification metadata to various media such as geotagged photographs or video, websites, SMS messages, QR Codes or RSS feeds, and is a form of geospatial metadata. The data usually consist of latitude and longitude coordinates, though they can also include altitude, bearing, distance, accuracy data and place names.

Geotagging can help users find a wide variety of location-specific information. For instance, one can find images taken near a given location by entering latitude and longitude coordinates into a suitable image search engine. Geotagging-enabled information services can also potentially be used to find location-based news, websites, or other resources. Geotagging can tell users the location of the content of a given picture or other media or the point of view, and conversely on some media platforms show media relevant to a given location.

The related term geocoding refers to the process of taking non-coordinate based geographical identifiers, such as a street address, and finding associated geographic coordinates (or vice versa for reverse geocoding). Such techniques can be used together with geotagging to provide alternative search techniques.
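
As a small illustration of geocoding and reverse geocoding, the sketch below uses the geopy library's Nominatim (OpenStreetMap) geocoder. This is one library choice among many, it requires network access, and the address is purely an example.

    # Geocoding (address -> coordinates) and reverse geocoding with geopy.
    from geopy.geocoders import Nominatim

    geolocator = Nominatim(user_agent="geotagging-example")  # identify the app

    # Geocoding: a street address becomes latitude/longitude coordinates.
    location = geolocator.geocode("Via dei Fori Imperiali, Rome, Italy")
    print(location.latitude, location.longitude)

    # Reverse geocoding: coordinates back to the nearest address/place name.
    place = geolocator.reverse((location.latitude, location.longitude))
    print(place.address)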

Geocaching is an outdoor sporting activity in which the participants use a Global Positioning System (GPS) receiver or mobile device and other navigational techniques to hide and seek containers, called “geocaches” or “caches”, anywhere in the world.

A typical cache is a small waterproof container containing a logbook where the geocacher enters the date they found it and signs it with their established code name. Larger containers such as plastic storage containers (Tupperware or similar) or ammunition boxes can also contain items for trading, usually toys or trinkets of little value. Geocaching shares many aspects with benchmarking, trigpointing, orienteering, treasure-hunting, letterboxing, and waymarking.

Geocaches are currently placed in over 200 countries around the world and on all seven continents, including Antarctica and the International Space Station. After more than 12 years of activity there are over 1.7 million active geocaches published on various websites. There are over 5 million geocachers worldwide.

Web Feature Service

The Open Geospatial Consortium Web Feature Service Interface Standard (WFS) provides an interface allowing requests for geographical features across the web using platform-independent calls. One can think of geographical features as the “source code” behind a map, whereas the WMS interface or online mapping portals like Google Maps return only an image, which end-users cannot edit or spatially analyze. The XML-based GML furnishes the default payload-encoding for transporting the geographic features, but other formats like shapefiles can also serve for transport. In early 2006, the OGC members approved the OpenGIS GML Simple Features Profile. This profile is designed to both increase interoperability between WFS servers and to improve the ease of implementation of the WFS standard.

The OGC membership defined and maintains the WFS specification. There are numerous commercial and open source implementations of the WFS interface standard, including an open source reference implementation called GeoServer. A comprehensive list of WFS implementations can be found in the OGC implementation database.
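
A WFS request is ultimately just an HTTP call carrying standard OGC query parameters. The sketch below issues a GetFeature request with Python's requests library; the endpoint URL and feature type name are placeholders (assumptions), not a real service.

    # A WFS 2.0 GetFeature request built from standard OGC query parameters.
    import requests

    params = {
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": "topp:states",  # placeholder feature type
        "count": 10,                 # cap the number of returned features
    }
    r = requests.get("https://example.com/geoserver/wfs", params=params)
    print(r.text[:500])  # GML features: the editable "source code" of the map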

Web Map Service

A Web Map Service (WMS) is a standard protocol for serving georeferenced map images over the Internet that are generated by a map server using data from a GIS database. The specification was developed and first published by the Open Geospatial Consortium in 1999.
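
By contrast with the WFS sketch above, a WMS GetMap request returns a rendered image rather than editable features. The endpoint and layer name are again placeholder assumptions.

    # A WMS 1.3.0 GetMap request; the response body is a map image.
    import requests

    params = {
        "service": "WMS",
        "version": "1.3.0",
        "request": "GetMap",
        "layers": "topp:states",           # placeholder layer
        "crs": "EPSG:4326",
        "bbox": "24.0,-125.0,50.0,-66.0",  # lat/lon axis order in WMS 1.3.0
        "width": 800,
        "height": 400,
        "format": "image/png",
    }
    r = requests.get("https://example.com/geoserver/wms", params=params)
    with open("map.png", "wb") as f:
        f.write(r.content)                 # a rendered, georeferenced map image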

Social Recognition of Employees in the Social Era

Farmville buffs zealously work their fields, buying seeds and equipment, harvesting and selling crops, all in an obsessive bid to earn points, win badges and rise to higher and higher levels. What keeps them hooked on posting needed items, gifting plants or animals and adding neighbors is the heart-warming recognition in the barrage of ‘likes’ and ‘comments’ that follows! In fact, the stupendous success of Facebook has proved that people simply love to share what’s going on in both their personal and professional lives across their social networks.

So, why not move these motivational techniques to an organizational platform to offer real-time recognition to employees 365 days a year, even on the go? Everyone wants to be appreciated for their contributions and achievements. So much so that an international strategic group, the ‘Recognition Council’, has been formed to raise awareness of how recognition and rewards, in their many forms, are part of an effective strategy for achieving better business performance! Traditional reward schemes and pay-for-performance incentives are no longer sufficient. The new generation craves constant feedback and praise, and in a highly visible format.

Social recognition steps in as the new currency that fosters employee engagement. Who cares what people think? Turns out… just about everybody. People love to be recognized, but they like it even better when they can share it throughout their company. That’s why we believe in the power of Social Employee Recognition: giving employees the power to share all of their great work company-wide through features such as Live Recognition. There’s more! Employees can also share the good work they are doing on external social networks such as Facebook, LinkedIn and Twitter. Give people a chance to look like rock stars in front of their friends and make them envious of the cool place where they work. It’s more than just an ego boost for people; it’s a shortcut to employee engagement, motivation and retention. This subtle performance management works by adding a social interface to employee appreciation efforts that showcases good work in a public forum.

New technology tools of social media and mobile applications bring informal, immediate, frequent, interactive and visible-to-all element to employee recognition. Now employee efforts, inputs, skills, knowledge and accomplishments can easily be celebrated in the open. In short, move the offline pat-on-the-back to an online realm!

Simple posts like ‘Great job’, ‘Well done’ or even ‘Thank you for your efforts’ on Facebook, Twitter, LinkedIn or more private company intranets are all that employees need.

The subsequent social feedback in the form of likes, comments, retweets and shares magnifies the value of the appreciation.

TPG Software topped an all-India survey of ‘Top Corporate Organizations for Best Practices in Rewards and Recognition’ by Edenred, a loyalty solution organization, in association with the Great Place to Work Institute. The highlight of TPG’s rewards program is that managers personally congratulate people for a good job by writing personal notes about good performance on the company intranet as well as on social and professional networks such as Facebook and LinkedIn.

American Express, which ranked third, uses ‘RewardBlue’, an internal intranet tool that delivers congratulatory messages to well-performing employees.

When employees feel that their efforts do matter and the effect is amplified with the congratulatory notes and approvals all around, the validation affects their self-perception and identity within the organization. The public recognition forges deeper connections, making them more supportive, loyal and eager to improve performance.

The ‘Social Recognition and Employees’ Organizational Support’ research thesis studied over 900 employees in service organisations. It concludes that social recognition contributes to increased self-respect, which in turn means that employees make a greater effort to act in the company’s best interests.

Broadcasting day-to-day employee recognition stories that would otherwise have gone untold not only encourages the appreciated behavior in the recipient, but also inspires other ‘viewers’ to emulate the behaviors that drive appreciation and success. The organization also gets an opportunity to showcase the activities that are expected, liked and valued, while reinforcing the desired corporate culture. The elated recipients will in turn tell the world (on social networking sites, again) how their company recognizes good work and what a great place it is to work!

There is an added element of peer-to-peer recognition, apart from the customary top-down method, as literally anyone can appreciate anyone. Management can use this both in performance reviews and to identify key talent. Eric Mosley, CEO of Globoforce, an employee recognition solutions provider, points out in a Harvard Business Review article, “Employees better understand what performance is desired on an on-going basis while managers can see first-hand an employee’s true performance, behaviors and influence.”

“The bigger business impact of social software in the HR context will be on the organization culture. Social software can alter the organizational fabric and culture, creating a more open and collaborative work environment,” observes Jeffrey Mann, Vice President at Gartner Research. It is easy, it is fast, and it is even safe and secure when used on internal networking sites. The impact is astounding and the value extraordinary, all at the nominal cost of the internal technology platform. What is more, this innovative form of recognition is completely non-monetary!

Recognition Council research reveals, “While some incentive clients are ready to use public tools like Facebook and LinkedIn, many prefer to keep programmes behind the firewall; using mechanisms that often resemble Facebook and other social media venues.” Top companies are reaping the rewards by turning recognition into a business asset.

Symantec reported a 16% increase in employee engagement in less than a year, and KPMG a solid 165%. DHL, Discovery Channel and P&G are also using this medium very successfully. “What sets social recognition apart is that it is unexpected. Whilst the overall monetary output from the organisation is relatively low, the impact is huge…employees don’t focus on the amount, they are just happy that someone has appreciated and acknowledged what they have done. That is hugely powerful,” exclaims Sara Turner, Head of Employee Benefits and Wellbeing, KPMG.

Experts opine that combining regular employee recognition programs with social recognition efforts will lead to even better results.

The Edenred study states that top corporate organizations for best practices in rewards and recognition use a healthy mix of monetary rewards, non-monetary rewards and social recognition, based on survey data from more than 13,000 employees and HR managers at more than 70 companies across 11 industries.

So, seize the opportunity to engage employees and take recognition to a new level to build an enviable employer brand.