
Wednesday, May 22, 2019

The High Luminosity Large Hadron Collider


The High Luminosity Large Hadron Collider (HL-LHC) is a major upgrade of the Large Hadron Collider (LHC), scheduled for completion by 2026. The new design boosts the machine's luminosity by a factor of five to seven, allowing roughly 10 times more data to be accumulated, giving a better chance of observing rare processes and improving statistically marginal measurements.

Luminosity is a measure of an accelerator's performance: it is proportional to the number of collisions that occur in a given amount of time. The HL-LHC will allow detailed studies of the new particles observed at the LHC, such as the Higgs boson, and will enable observation of rare processes that were inaccessible at previous sensitivity levels. More than 15 million Higgs bosons could be produced each year, for example, compared with the 1.2 million produced in 2011-2012.
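The relationship between luminosity and event counts can be sketched with the standard formula N = σ × L_int (expected events equal the production cross section times the integrated luminosity). The specific numbers below are rough, illustrative assumptions for the sake of the arithmetic, not official CERN figures:

```python
# Expected number of events for a process: N = sigma * L_int
# (cross section times integrated luminosity).

PB_TO_FB = 1_000  # 1 picobarn = 1000 femtobarns


def expected_events(cross_section_pb: float, integrated_lumi_fb_inv: float) -> float:
    """Expected event count, given a cross section in pb
    and an integrated luminosity in fb^-1."""
    return cross_section_pb * PB_TO_FB * integrated_lumi_fb_inv


# Assumed ~55 pb total Higgs production cross section at 13 TeV,
# and ~250 fb^-1 of data collected per HL-LHC year (illustrative values).
higgs_per_year = expected_events(55, 250)
print(f"~{higgs_per_year:,.0f} Higgs bosons per year")
```

With these assumed inputs the estimate lands on the order of 10^7 Higgs bosons per year, which is consistent with the "more than 15 million" figure quoted above.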


The development of the HL-LHC depends on several technological innovations that are exceptionally challenging for researchers – such as cutting-edge high-field superconducting magnets, very compact and ultra-precise superconducting cavities for beam rotation, and 300-metre-long high-power superconducting links with zero energy dissipation. Together, these upgrades will help advance and refine the knowledge already gained from the Higgs boson and provide fresh insights into so-called "New Physics", a more fundamental and general theory than the Standard Model.

Tuesday, May 21, 2019

Self-learning software (SLS)


Building on the industry shift toward SaaS, a new trend is emerging in the software space that combines SaaS and AI. Leading companies such as Amazon, Google, Microsoft, and IBM have begun offering their AI infrastructure as a service to clients. In other words, AI and machine learning are no longer accessible only to software giants; now any company or developer can tap online AI resources to build self-learning software (SLS).

At home, this could mean having an SLS system manage your future smart home, including tasks like pre-heating your home before you arrive or keeping track of groceries you need to buy. By the 2020s and into the 2030s, these SLS systems will play a vital role in the corporate, government, military, and consumer markets, gradually helping each improve their productivity and reduce waste of all kinds.
The SaaS and SLS models work only if the Internet (and the infrastructure behind it) continues to grow and improve, alongside the computing and storage hardware that runs the 'cloud' these SaaS/SLS systems operate on. The coming decade will see the next quantum leap in communication and interconnectivity, mediated by a range of future computer interfaces, and it may just reshape what it means to be human.

Poking, pinching, and swiping at the air


As of 2018, smartphones have replaced standard mobile phones in much of the developed world. This means a large portion of the world is now familiar with the various tactile commands mentioned above. Through apps and games, smartphone users have learned a wide variety of abstract skills to control the comparative supercomputers sitting in their pockets. It's these skills that will prepare consumers for the next wave of devices: devices that will let us more easily merge the digital world with our real-world environments. So let's take a look at some of the tools we'll use to navigate our future world.

Open-air gesture control

As of 2018, we're still in the micro-age of touch control. We still poke, pinch, and swipe our way through our mobile lives. But that touch control is slowly giving way to a form of open-air gesture control. For the gamers out there, your first interaction with this may have been playing overactive Nintendo Wii games or Xbox Kinect games; both consoles use advanced motion-capture technology to match player movements with game avatars.
Well, this tech isn't staying confined to video games and green-screen filmmaking; it will soon enter the broader consumer electronics market. One striking example of what this might look like is a Google venture named Project Soli. The project's developers use miniature radar to track the fine movements of your hand and fingers, simulating the poke, pinch, and swipe in open air instead of against a screen. This is the kind of tech that will help make wearables easier to use, and thus more attractive to a wider audience.

Speaking to your virtual assistant


While we're slowly reimagining touch UI, a new and complementary form of UI is emerging that may feel even more intuitive to the average person: speech. Amazon made a cultural splash with the release of its artificially intelligent (AI) personal assistant system, Alexa, and the various voice-activated home assistant products it released alongside it. Google, the supposed leader in AI, rushed to follow suit with its own suite of home assistant products.

Whether you prefer Amazon's Alexa, Google's Assistant, Apple's Siri, or Microsoft's Cortana, these services are designed to let you interface with your phone or smart device and access the knowledge bank of the web with simple verbal commands, telling these 'virtual assistants' what you want. It's an amazing feat of engineering. And while it's not quite perfect, the technology is improving quickly. When you combine this falling error rate with the massive innovations happening in microchips and cloud computing (outlined in upcoming chapters of this series), we can expect virtual assistants to become pleasantly accurate by 2020.

Even better, the virtual assistants currently being engineered will not only understand your speech perfectly, but they will also understand the context behind the questions you ask; they will recognize the indirect signals given off by your tone of voice; they will even engage in long-form conversations with you, Her-style. Overall, voice-recognition-based virtual assistants will become the primary way we access the web for our day-to-day informational needs.

Haptic holograms


The holograms we’ve all seen in person or in the movies tend to be 2D or 3D projections of light that show objects or people hovering in the air. What these projections all have in common is that if you reached out to grab them, you would only get a handful of air. That won’t be the case by the mid-2020s. New technologies (see examples: one and two) are being developed to create holograms you can touch (or at least mimic the sensation of touch, i.e. haptics). Depending on the technique used, be it ultrasonic waves or plasma projection, haptic holograms will open up an entirely new industry of digital products that we can use in the real world.

Think about it: instead of a physical keyboard, you could have a holographic one that gives you the physical sensation of typing, wherever you're standing in a room. This technology is what will mainstream the Minority Report open-air interface and possibly end the age of the traditional desktop.

Imagine this: Instead of carrying around a bulky laptop, you could one day carry a small square wafer (maybe the size of a thin external hard drive) that would project a touchable display screen and keyboard hologram. Taken one step further, imagine an office with only a desk and a chair, then with a simple voice command, an entire office projects itself around you—a holographic workstation, wall decorations, plants, etc. Shopping for furniture or decoration in the future may involve a visit to the app store along with a visit to Ikea.


Monday, May 20, 2019

Three-dimensional interface


Taking this open-air gesture control further along its natural progression, by the mid-2020s we may see the traditional desktop interface—the trusty keyboard and mouse—slowly replaced by the gesture interface, in the same style popularized by the movie Minority Report. In fact, John Underkoffler—UI researcher, science adviser for the film, and inventor of its holographic gesture interface scenes—is currently working on the real-life version: a technology he refers to as a human-machine interface spatial operating environment. (He'll probably need to come up with a handy acronym for that.)

Using this technology, you will one day sit or stand in front of a large display and use various hand gestures to command your computer. It looks really cool (see link above), but as you might guess, hand gestures might be great for skipping TV channels, pointing and clicking on links, or designing three-dimensional models, but they won't work so well for writing long essays. That's why, as open-air gesture technology is gradually incorporated into more and more consumer electronics, it will likely be joined by complementary UI features like advanced voice command and iris-tracking technology. Yes, the humble physical keyboard may yet survive into the 2020s.

The online marketplace will become the new traditional business model


Marketplaces will continue to be the new department store. In 2019, we will continue to see marketplaces overtake traditional department stores as the top revenue drivers in retail. Even though some retailers like Walmart and Target are working to innovate beyond their traditional business models, this won't be enough to save most department stores from slow (or in some cases, rapid) decline. A marketplace charges only a commission, since it provides sellers with a platform (and a huge network) through which to sell. Examples of online marketplaces include Amazon.com, Flipkart.com, and Hotels.com.
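The commission model described above can be sketched in a few lines. The function name, sale amounts, and 10% rate below are purely hypothetical illustrations of how a platform earns revenue without holding inventory:

```python
# Hypothetical marketplace revenue model: the platform keeps a commission
# on each third-party sale instead of buying and reselling inventory
# the way a department store does.


def platform_revenue(sales: list[float], commission_rate: float) -> float:
    """Total commission earned across a list of sale amounts."""
    return sum(sale * commission_rate for sale in sales)


# Three third-party sales on the platform, with an assumed 10% commission.
sales = [120.00, 45.50, 300.00]
print(platform_revenue(sales, commission_rate=0.10))
```

The sellers carry the cost of goods; the platform's revenue scales with transaction volume, which is why network size matters so much in this model.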


Online marketplaces are online and/or mobile platforms that act as virtual stores, instantly connecting your service offering to the consumers who need it. They remove the physical barriers of time and place, allowing transactions to happen securely online. Service providers and contractors must also understand the new business models driving market behaviour, and adopt new ways of working, to survive in a fast-changing environment. Overall, the rise of the marketplace will significantly benefit consumers and the established players who capitalise on the early opportunities these services present.



Existing service providers must adapt to this new marketplace or risk losing both market share and their workforce. The convergence of anytime/anywhere technology and consumer willingness to engage with these platforms has created an environment ripe for further disruption across industries. Marketplace start-ups are testing the idea that any market can be disintermediated via an online and/or mobile platform.


