
Tuesday, May 21, 2019

Self-learning software (SLS)


Building on the industry shift toward SaaS, a new trend is emerging in the software space that combines SaaS and AI. Leading companies such as Amazon, Google, Microsoft, and IBM have begun offering their AI infrastructure as a service to their clients. In other words, AI and machine learning are no longer accessible only to software giants; now any company or developer can tap into online AI resources to build self-learning software (SLS).
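
The details differ from provider to provider, but the basic pattern behind "AI as a service" is the same: your application sends data over the web to a hosted model and gets a prediction back. The sketch below shows that pattern in Python; the endpoint URL, API key, and response fields are placeholders for illustration, not any particular vendor's real API.

    # A minimal sketch of the "AI as a service" pattern described above.
    # The endpoint URL, API key, and response format are hypothetical --
    # real providers (Amazon, Google, Microsoft, IBM) each have their own SDKs.
    import requests

    API_URL = "https://api.example-cloud-ai.com/v1/predict"   # hypothetical endpoint
    API_KEY = "YOUR_API_KEY"                                   # placeholder credential

    def classify_text(text):
        """Send text to a hosted machine-learning model and return its prediction."""
        payload = {"input": text}
        headers = {"Authorization": f"Bearer {API_KEY}"}
        response = requests.post(API_URL, json=payload, headers=headers, timeout=10)
        response.raise_for_status()
        return response.json()  # e.g. {"label": "positive", "confidence": 0.94}

    if __name__ == "__main__":
        print(classify_text("The new thermostat schedule saved us money this month."))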

At home, this could mean having an SLS system manage your future smart home, including tasks like pre-heating your home before you arrive or keeping track of groceries you need to buy. By the 2020s and into the 2030s, these SLS systems will play a vital role in the corporate, government, military, and consumer markets, gradually helping each improve their productivity and reduce waste of all kinds.
The only way the SaaS and SLS models work is if the Internet (and the infrastructure behind it) continues to grow and improve, alongside the computing and storage hardware that runs the ‘cloud’ these SaaS/SLS systems operate on. The coming decade will see the next evolution, the next quantum leap, in communication and interconnectivity, entirely intermediated by a range of future computer interfaces, and it may just reshape what it means to be human.

Poking, pinching, and swiping at the air


As of 2018, smartphones have replaced standard mobile phones in much of the developed world. This means a large portion of the world is now familiar with the various tactile commands mentioned above. Through apps and games, smartphone users have learned a large variety of abstract skills to control what are, in effect, supercomputers sitting in their pockets. It's these skills that will prepare consumers for the next wave of devices: devices that will allow us to more easily merge the digital world with our real-world environments. So let's take a look at some of the tools we'll use to navigate our future world.

Open-air gesture control. As of 2018, we're still in the micro-age of touch control. We still poke, pinch, and swipe our way through our mobile lives. But that touch control is slowly giving way to a form of open-air gesture control. For the gamers out there, your first interaction with this may have been playing overactive Nintendo Wii games or Xbox Kinect games; both consoles use advanced motion-capture technology to match player movements with game avatars.
Well, this tech isn't staying confined to video games and green-screen filmmaking; it will soon enter the broader consumer electronics market. One striking example of what this might look like is a Google venture named Project Soli. Developers of this project use miniature radar to track the fine movements of your hand and fingers, simulating the poke, pinch, and swipe in open air instead of against a screen. This is the kind of tech that will help make wearables easier to use, and thus more attractive to a wider audience.
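
To get a feel for what that means in software terms, here's a rough Python sketch of turning raw hand motion into poke, pinch, and swipe events. It is not Project Soli's actual interface; the sensor readings, thresholds, and handlers are invented purely for illustration.

    # A toy sketch of open-air gesture recognition, not Project Soli's actual API.
    # It assumes a sensor (radar, camera, etc.) that reports fingertip positions
    # as (x, y, z) tuples in metres; the thresholds below are illustrative guesses.

    def classify_gesture(start, end, pinch_distance_change):
        """Map a short hand-motion sample to 'poke', 'pinch', 'swipe', or None."""
        dx = end[0] - start[0]
        dy = end[1] - start[1]
        dz = end[2] - start[2]

        if pinch_distance_change < -0.02:      # thumb and index finger closed together
            return "pinch"
        if dz < -0.05 and abs(dx) < 0.02 and abs(dy) < 0.02:  # push toward the sensor
            return "poke"
        if abs(dx) > 0.10 or abs(dy) > 0.10:   # broad lateral movement
            return "swipe"
        return None

    # Dispatch gestures to UI actions, just as touch events are dispatched today.
    handlers = {
        "poke": lambda: print("select item"),
        "pinch": lambda: print("zoom out"),
        "swipe": lambda: print("next page"),
    }

    gesture = classify_gesture(start=(0.0, 0.0, 0.3), end=(0.15, 0.01, 0.3),
                               pinch_distance_change=0.0)
    if gesture in handlers:
        handlers[gesture]()   # prints "next page"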

Speaking to your virtual assistant


While we're slowly reimagining touch UI, a new and complementary form of UI is emerging that may feel even more intuitive to the average person: speech. Amazon made a cultural splash with the release of its artificially intelligent (AI) personal assistant system, Alexa, and the various voice-activated home assistant products it released alongside it. Google, the supposed leader in AI, rushed to follow suit with its own suite of home assistant products.

Whether you prefer Amazon's Alexa, Google's Assistant, Apple's Siri, or Microsoft's Cortana, these services are designed to let you interface with your phone or smart device and access the knowledge bank of the web with simple verbal commands, telling these ‘virtual assistants’ what you want. It's an amazing feat of engineering. And even while it's not quite perfect, the technology is improving quickly, with recognition error rates falling year over year. When you combine those falling error rates with the massive innovations happening in microchips and cloud computing (outlined in the upcoming chapters of this series), we can expect virtual assistants to become pleasantly accurate by 2020.
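
Under the hood, every one of these assistants follows roughly the same loop: turn speech into text, guess the intent behind it, and act on that intent. The Python sketch below is a deliberately simplified version of that loop; the intents, keywords, and canned responses are hypothetical, and real assistants use large speech and language models rather than keyword matching.

    # A simplified sketch of the command pipeline behind a virtual assistant:
    # transcribed speech comes in, an intent is matched, and an action runs.

    INTENTS = {
        "weather": ["weather", "forecast", "rain"],
        "timer": ["timer", "remind", "alarm"],
        "music": ["play", "song", "music"],
    }

    def match_intent(transcript):
        """Return the first intent whose keywords appear in the transcript."""
        words = transcript.lower().split()
        for intent, keywords in INTENTS.items():
            if any(keyword in words for keyword in keywords):
                return intent
        return "unknown"

    def handle(intent):
        responses = {
            "weather": "Today looks sunny with a high of 22 degrees.",
            "timer": "Okay, your timer is set.",
            "music": "Playing your favorites playlist.",
            "unknown": "Sorry, I didn't catch that.",
        }
        return responses[intent]

    print(handle(match_intent("What's the weather like tomorrow?")))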

Even better, the virtual assistants currently being engineered will not only understand your speech perfectly, but they will also understand the context behind the questions you ask; they will recognize the indirect signals given off by your tone of voice; they will even engage in long-form conversations with you, Her-style. Overall, voice-recognition-based virtual assistants will become the primary way we access the web for our day-to-day informational needs.

Haptic holograms


The holograms we’ve all seen in person or in the movies tend to be 2D or 3D projections of light that show objects or people hovering in the air. What these projections all have in common is that if you reached out to grab them, you would only get a handful of air. That won’t be the case by the mid-2020s. New technologies are being developed to create holograms you can touch (or at least mimic the sensation of touch, i.e., haptics). Depending on the technique used, be it ultrasonic waves or plasma projection, haptic holograms will open up an entirely new industry of digital products that we can use in the real world.

Think about it: instead of a physical keyboard, you could have a holographic one that gives you the physical sensation of typing, wherever you’re standing in a room. This technology is what will mainstream the Minority Report open-air interface and possibly end the age of the traditional desktop.

Imagine this: Instead of carrying around a bulky laptop, you could one day carry a small square wafer (maybe the size of a thin external hard drive) that would project a touchable display screen and keyboard hologram. Taken one step further, imagine an office with only a desk and a chair, then with a simple voice command, an entire office projects itself around you—a holographic workstation, wall decorations, plants, etc. Shopping for furniture or decoration in the future may involve a visit to the app store along with a visit to Ikea.
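
To make the idea concrete, here is a rough Python sketch of what the software behind such a holographic keyboard might do: map a tracked fingertip position onto a projected key layout and fire a burst of haptic feedback. Every name and number in it is illustrative, not drawn from any real product.

    # A conceptual sketch of how a projected haptic keyboard might turn a tracked
    # fingertip position into a keypress plus a burst of ultrasonic feedback.
    # The layout, dimensions, and emit_haptic_pulse() call are all hypothetical.

    KEY_WIDTH = 0.019   # metres, roughly the key pitch of a physical keyboard
    ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]

    def key_at(x, y):
        """Map a fingertip (x, y) position on the projection plane to a key."""
        row = int(y // KEY_WIDTH)
        col = int(x // KEY_WIDTH)
        if 0 <= row < len(ROWS) and 0 <= col < len(ROWS[row]):
            return ROWS[row][col]
        return None

    def emit_haptic_pulse(x, y):
        # Placeholder: a real system would focus ultrasonic transducers here
        # to create the sensation of resistance at the fingertip.
        print(f"haptic pulse at ({x:.3f}, {y:.3f})")

    def on_fingertip_press(x, y):
        key = key_at(x, y)
        if key is not None:
            emit_haptic_pulse(x, y)
            print(f"key pressed: {key}")

    on_fingertip_press(0.040, 0.005)   # lands on row 0, column 2 -> "e"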

