FILE PHOTO: Prompts on how to use Amazon's Alexa personal assistant are seen in an Amazon ‘experience centre’ in Vallejo, California, U.S., May 8, 2018. REUTERS/Elijah Nouvelage
(Reuters) - Amazon.com Inc (AMZN.O) on Thursday said it shifted part of the computing for its Alexa voice assistant to its own custom-designed chips, aiming to make the work faster and cheaper while moving it away from chips supplied by Nvidia Corp (NVDA.O).
When users of devices such as Amazon’s Echo line of smart speakers ask the voice assistant a question, the query is sent to one of Amazon’s data centers for several steps of processing. When Amazon’s computers spit out an answer, that reply is in a text format that must be translated into audible speech for the voice assistant.
Amazon previously handled that computing using chips from Nvidia but now the “majority” of it will happen using its own “Inferentia” computing chip. First announced in 2018, the Amazon chip is custom designed to speed up large volumes of machine learning tasks such as translating text to speech or recognizing images.
Cloud computing customers such as Amazon, Microsoft Corp (MSFT.O) and Alphabet Inc's (GOOGL.O) Google have become some of the biggest buyers of computing chips, driving booming data center sales at Intel Corp (INTC.O), Nvidia and others.
But major technology companies are increasingly ditching traditional silicon providers to design their own chips. Apple on Tuesday introduced its first Mac computers with its own central processors, moving away from Intel chips.
Amazon said the shift to the Inferentia chip for some of its Alexa work has resulted in 25% better latency, a measure of speed, at a 30% lower cost.
Amazon has also said that “Rekognition,” its cloud-based facial recognition service, has started to adopt its own Inferentia chips. However, the company did not say which chips the facial recognition service had previously used or how much of the work had shifted to its own chips.
The service has come under scrutiny from civil rights groups because of its use by law enforcement. Amazon in June put a one-year moratorium on its use by police after the killing of George Floyd.
Reporting by Stephen Nellis in San Francisco; Editing by Tom Brown
November 13, 2020 at 12:31AM
https://ift.tt/32C7EaP
Amazon shifts some voice assistant, face recognition computing to its own chips - Reuters
Graphic shows a close view of an inhibitory synapse in 3D. Credit: TAO Changlu
Synapses are specialized brain structures where learning and memory occur. The efficient transmission of synaptic signals relies on the delicate structure and complex molecular composition of the synapses. However, the small size (several hundred nanometers in diameter) and heterogeneous nature of the synapses pose significant challenges in direct observation of the molecules inside synapses.
Using a newly proposed processing technique for in situ cryo-electron tomography, researchers from the University of Science and Technology of China (USTC) and the Shenzhen Institutes of Advanced Technology (SIAT) of the Chinese Academy of Sciences (CAS) became the first scientists to observe individual GABAA receptors and their organization on the synaptic membrane, an arrangement that underpins the brain's ability to process information.
"The advance of this study comes from the in situ cryo-electron microscopy, a method that preserves the cells in native states and has an order of magnitude of higher resolution compared to the super-resolution optical microscopy," said Tao Changlu, postdoctoral fellow from USTC and the study's co-first author, now associate investigator at SIAT.
This image processing technique is able to automatically locate the membrane proteins in their cellular context. "To ensure that we detect every receptor on the postsynaptic membrane, we oversampled the synaptic membrane and classified all the sampled 3-D images without any template," said Liu Yuntao, graduate student from USTC and the study's co-first author, now a postdoctoral fellow at UCLA. "We even used a negative control, sampling the presynaptic membrane, to validate our observation."
Once the receptors were detected, the researchers realized that they are not randomly distributed on the membrane: the receptors tend to keep the same 11 nm 'social distancing' from each other. Intriguingly, they can rotate freely even while constrained to that spacing.
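For readers curious how such a spacing figure emerges from the tomograms, here is a minimal sketch of the kind of nearest-neighbour calculation involved once receptor positions have been detected. It is not the authors' code, and it runs on made-up coordinates purely to show the mechanics:

```python
# Illustrative only: estimate the typical nearest-neighbour spacing from a set
# of detected receptor coordinates (here, fake 2-D positions on a membrane patch).
import numpy as np

def nearest_neighbor_spacing(positions_nm):
    """Distance from each receptor to its closest neighbour, in nm."""
    diffs = positions_nm[:, None, :] - positions_nm[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    np.fill_diagonal(dists, np.inf)          # ignore each point's distance to itself
    return dists.min(axis=1)

rng = np.random.default_rng(0)
fake_positions = rng.uniform(0, 200, size=(40, 2))   # stand-in data, not the study's
print(f"median spacing: {np.median(nearest_neighbor_spacing(fake_positions)):.1f} nm")
```

On the real data, a distribution of these distances peaking near 11 nm is what the 'social distancing' observation amounts to.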
"The social distancing among receptors could arise from their interactions with scaffolding molecules—gephyrins," said Bi Guoqiang, professor of neuroscience at USTC and senior author of the paper.
The scaffolding molecules form a 5-nm thick density sheet to support and regulate GABAA receptors on the membrane. Together, they form an absorbing semi-ordered structure called a 'mesophasic assembly.'
A mesophasic state lies between liquid and solid. It might be induced by the multivalent interactions between the receptors and their scaffolding molecules, and it may help attract the readily releasable vesicles containing neurotransmitters. The inhibitory synapses could store information by arranging the GABAA receptors in such a low-entropy Goldilocks state.
This semi-ordered structure differs from the previously proposed hexagonal lattice organization of GABAA receptors and gephyrins. Notably, each synapse tends to contain one mesophasic assembly, rather than multiple nano-domains as observed in excitatory synapses with super-resolution optical microscopy.
"This work represents the first nanometer-resolution observation at the inhibitory synaptic receptors and a critical step towards resolving the atomic details of the brain," said Zhou Hong, director of the Electron Imaging Center for NanoMachines at the California NanoSystems Institute at UCLA, also senior author of the paper.
More information: Yun-Tao Liu et al, Mesophasic organization of GABAA receptors in hippocampal inhibitory synapses, Nature Neuroscience (2020). DOI: 10.1038/s41593-020-00729-w
Provided by University of Science and Technology of China
Citation: Scientists snap together molecular building blocks of brain computing (2020, November 9) retrieved 9 November 2020 from https://ift.tt/3eJbTqg
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without the written permission. The content is provided for information purposes only.
November 09, 2020 at 11:50PM
https://ift.tt/3n2CJfI
Scientists snap together molecular building blocks of brain computing - Medical Xpress
It can save a life, figure out someone’s predisposition to developing cancer, solve a crime from a long time ago or find long-lost relatives: genome sequencing has come a long way since the human genome was first sequenced in the early 2000s. Fast-forward to today, and this process of determining someone’s complete genetic code is becoming ever more routine. Thousands of COVID-19 survivors, for instance, are now getting their genome mapped, in a bid to help researchers understand how specific genetic makeup could affect a person’s susceptibility to the coronavirus.
But while peeking into someone's DNA often does help prevent, diagnose and treat many diseases, obtaining the genetic fingerprint also exposes that individual's personal information encoded in the genome. This is the conundrum around the future of precision medicine. Suddenly, you're sharing all six billion base pairs of your DNA with the people sequencing your genome. Whatever the goal, genome mapping and sequencing jeopardizes our privacy.
But it doesn’t have to be like that. There is a way to completely obscure someone’s DNA records (and, to be clear, sensitive data sets in general) while still keeping the data useful: by encrypting it. Say hello to fully homomorphic encryption (FHE). A mouthful perhaps, but in reality a rather simple type of next-generation cryptography that is so secure that even future quantum computers won’t be able to crack it.
Encryption we commonly use today doesn’t make our data totally safe. Whenever one needs to run any computations, for example to carry out necessary medical genetic testing on a sequenced genome, the data have to be decrypted. However briefly, the data become susceptible to theft and leaks.
With FHE, though, the data never get decrypted. The information is encoded in such a way that it remains encrypted all the time—when it’s being transmitted or when it’s in storage, and also during any computations. The data stay cryptographically jumbled to preserve privacy while they are being processed, and so that even the people handling the data can’t know the contents. So even if the data do get stolen or leaked, they will remain safely encrypted. The recipient simply has to decrypt the results with a special secret key, and that move doesn’t reveal any information about the source.
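To make the idea concrete, here is a toy Python sketch of the Paillier cryptosystem. It is only additively homomorphic, its parameters are laughably insecure, and it is not FHE or any production library; but it shows how arithmetic performed on ciphertexts carries over to the hidden plaintexts without anything being decrypted along the way:

```python
# Toy Paillier scheme: adding plaintexts by multiplying ciphertexts.
# Educational only -- the tiny primes below offer no security whatsoever.
import math
import random

def keygen(p=293, q=433):                        # real deployments use primes of 1024+ bits
    n = p * q
    lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)   # Carmichael's lambda(n)
    mu = pow(lam, -1, n)                         # valid because the generator is n + 1
    return (n,), (lam, mu, n)                    # (public key), (private key)

def encrypt(pub, m):
    (n,) = pub
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:                   # r must be invertible modulo n
        r = random.randrange(1, n)
    return (pow(n + 1, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(priv, c):
    lam, mu, n = priv
    return ((pow(c, lam, n * n) - 1) // n) * mu % n

pub, priv = keygen()
c1, c2 = encrypt(pub, 20), encrypt(pub, 22)
c_sum = (c1 * c2) % (pub[0] ** 2)                # multiplying ciphertexts adds the plaintexts
assert decrypt(priv, c_sum) == 42                # only the private-key holder sees 20 + 22
```

A fully homomorphic scheme extends this idea so that both addition and multiplication, and hence arbitrary computation, can be carried out on ciphertexts.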
Even when quantum computers become powerful enough to break modern cryptography, easily cracking typical encryption algorithms, they won’t be able to break homomorphic encryption. This is because FHE is based on the mathematics of lattices—repeating, multidimensional gridlike collections of points. Lattice-based encryption schemes hide data inside such a collection, some distance away from a point. Calculating just how far away an encrypted message is from a lattice point is extremely difficult for both a quantum and a traditional computer.
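And purely to illustrate the lattice intuition, here is a single-bit learning-with-errors (LWE) style encryption with toy parameters of my choosing. The bit is hidden a small, noisy distance away from a value determined by a secret vector, and only someone who knows that secret can tell where the ciphertext sits:

```python
# Toy LWE-style encryption of one bit. Parameters are far too small to be secure;
# they only make the "hidden a small distance from a lattice point" idea visible.
import numpy as np

rng = np.random.default_rng(1)
q, n = 3329, 16                                   # modulus and secret dimension
s = rng.integers(0, q, size=n)                    # the secret vector

def encrypt(bit):
    a = rng.integers(0, q, size=n)                # public random direction
    e = int(rng.integers(-3, 4))                  # small noise term
    b = (int(a @ s) + e + bit * (q // 2)) % q     # shift by q/2 to encode a 1
    return a, b

def decrypt(ciphertext):
    a, b = ciphertext
    centered = (b - int(a @ s)) % q               # strip out the secret's contribution
    return int(q // 4 < centered < 3 * q // 4)    # near q/2 means the bit was 1

assert all(decrypt(encrypt(bit)) == bit for bit in (0, 1, 1, 0))
```

Without the secret, recovering the bit amounts to solving a noisy linear-algebra problem, which is the kind of lattice problem believed to resist both classical and quantum attack.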
Scientists first started working on homomorphic encryption in the 1970s, but it stayed pure research until a decade ago. In 2009, computer scientist Craig Gentry developed the first FHE scheme as part of his doctoral dissertation. In the following years, while he worked with collaborators at IBM Research, the technique kept being refined, becoming faster and more practical. Preserving genomic privacy is just one possible use of FHE. It can be used to preserve any sensitive data, be they medical records or financial information.
Homomorphic encryption also addresses the problem of sharing data—critical because of Europe’s GDPR regulations, a country’s specific privacy laws or even a company’s own regulations. For instance, take a bank. If two departments were to share their data, one dealing with insurance and another one with investment, there would be data aggregation, giving data analysts access to all the data. With FHE, the analysts wouldn’t have a clue what the data are about.
Last year, a Brazilian bank, Banco Bradesco, partnered with IBM for a trial of the FHE technology on real financial data. The researchers showed that it was possible to perform predictions on encrypted data, hiding the data during processing. First, they encrypted the existing machine learning–based prediction model and ran predictions with the same accuracy as without encryption. Then they retrained the model using new encrypted data and showed that it was possible to use homomorphic encryption to preserve the privacy of the data, never exposing any client information.
Currently, the computational requirements of FHE are a lot greater than with typical modern encryption, making the process much, much longer. But the technology keeps improving, and in the near future is likely to become fast enough for many different applications. When that happens, it should become the default crypto option for sensitive data, especially medical and genomic. Because at the end of the day, there’s nothing more important than the data about our genetic makeup and that of our children—the information about what makes us “us.”
November 09, 2020 at 07:00PM
https://ift.tt/2IlXS5V
How to Preserve the Privacy of Your Genomic Data - Scientific American
Chinese search engine Baidu has released an app that allows users to play high-end games on their under-equipped devices.
The app essentially allows users to run resource-intensive games and apps on its cloud servers, instead of their own devices.
Baidu, popularly known as China’s Google, calls this app the ‘Cloud Phone’, since in essence it allows users to have a virtual phone on top of their original device.
Phone-as-a-service
Baidu is the first of a growing number of Chinese firms racing to market smartphones that run on the cloud, according to a report in KrASIA.
China's largest phone maker, Huawei, launched a similar service aimed at enterprise users earlier this year. According to KrASIA, Chinese tech giant Alibaba also appears to be interested in such subscription-based streaming app services.
As per Cdrinfo, on the other end of Baidu’s ‘Cloud Phone’ is an ARM server and ARM virtualisation technology, supposedly developed independently by Baidu.
Besides streaming games, which isn't all that innovative on its own, Baidu's Cloud Phone also offers "cloud VR". This service supposedly encodes both display and audio on remote cloud servers before streaming them to the user's device. Cdrinfo writes that this allows users to enjoy VR content without a headset, which seems counter-intuitive.
We haven’t been able to verify the claims independently. But the possibility of leveraging the cloud for enabling low-end devices to punch above their weight seems like a smart move. If anything, this is one way to circumvent the planned obsolescence in devices.
Baidu’s service costs RMB 4 (US$0.6) per day, with discounts for monthly and yearly subscriptions.
November 08, 2020 at 07:31PM
https://ift.tt/2U34Aju
A Chinese company may have come up with the future of smartphones (and computing) - TechRadar
The Alibaba Group Holdings Ltd. headquarters stand illuminated at night ahead of the annual November 11 Singles' Day online shopping event in Hangzhou, China, on Sunday, Nov. 10, 2019.
Qilai Shen | Bloomberg | Getty Images
GUANGZHOU, China — The growth of Alibaba's cloud business outpaced Amazon and Microsoft in the quarter ending in September, and the Chinese tech giant reiterated its commitment to making the unit profitable by next March.
Alibaba reported cloud computing brought in revenue of 14.89 billion yuan ($2.24 billion) in the three months ending Sept. 30. That's a 60% year-on-year rise and its fastest rate of growth since the December quarter of 2019.
That was faster than Amazon Web Services' 29% year-on-year revenue rise and Microsoft Azure's 48% growth in the September quarter.
It's important to note that Alibaba's cloud computing business is significantly smaller than these two market leaders.
Alibaba is the fourth largest public cloud computing provider globally, according to Synergy Research Group.
Alibaba CEO Daniel Zhang said that the public sector and financial services contributed the most to the cloud division's growth.
"We believe cloud computing is fundamental infrastructure for the digital era, but it is still in the early stage of growth. We are committed to further increasing our investments in cloud computing," Zhang said on the earnings call.
Alibaba's loss from the cloud computing business was 3.79 billion yuan in the September quarter, much wider than the 1.92 billion yuan loss reported in the same period last year. However, Chief Financial Officer Maggie Wu pointed to earnings before interest, taxes, and amortization (EBITA), another measure of profitability.
EBITA loss narrowed to 156 million yuan from 521 million yuan in the same period last year. The EBITA margin was negative 1%.
On this basis, Wu said on the earnings call that Alibaba management "definitely expect to see profitability in the following two quarters."
"As I talked about during the Investor Day, we do not see any reason that for the long‑term, Alibaba cloud computing cannot reach to the margin level that we see in other peer companies. Before that, we are going to continue to focus expanding our cloud computing market leadership and also grow our profits," she said.
November 06, 2020 at 12:24PM
https://ift.tt/36asP4N
Alibaba cloud growth outpaces Amazon and Microsoft as Chinese tech giant pushes for profitability - CNBC
Qualcomm disclosed its momentum in automotive and Internet of things as it leverages 5G into more growth markets and eyes edge computing.
The company's fiscal fourth quarter earnings report included automotive and IoT sales. Both markets significantly overlap with 5G but diversify Qualcomm away from smartphones.
Qualcomm's IoT revenue, which includes industrial, fixed wireless broadband and networking, has expanded for the last three quarters and is nearing a $4 billion annual run rate. IoT revenue for Qualcomm was $926 million in the fiscal fourth quarter and $3.03 billion for the year.
IoT revenue has been driven by networking, retail, industrial, tracking and utility verticals.
Automotive chips for telematics and digital cockpit installations delivered revenue of $188 million. This business is Qualcomm's smallest, but brings predictable revenue growth.
Qualcomm's automotive design win pipeline is about $8 billion, up from $6.5 billion at the start of the fiscal year.
Qualcomm's fourth quarter handset revenue of $3 billion only had a few weeks of sales from a big OEM customer (think Apple).
CEO Steven Mollenkopf explained:
Our 5G design wins continue to be powered by our RF front-end solutions, whether they support 4G, sub-6, millimeter wave or both 5G bands, and whether they are in smartphones or other products such as embedded modules for PCs, IoT solutions or mobile hotspots. As we have in RF, we have built beachhead positions in both auto and IoT. Our scale enables us to make multiple profitable bets in areas where we expect a tailwind as each of these industry road maps adopt cellular technologies, as you can see taking place today in automotive, where we have emerged as a strategic technology partner to the automotive industry, with nearly all the major OEMs adopting our products. Next-generation 5G telematics design wins, in addition to our 3G and 4G design wins, solidify our position as a leader in connected cars.
Next up for Qualcomm is to tackle edge computing and cloud infrastructure. "Turning to inference, with over 10 years of AI R&D and over 1 billion AI-capable devices enabled with our technology, and fundamental assets such as low-power compute, process node leadership and signal processing expertise, we are well positioned to extend our smartphone AI leadership into growing applications, such as data centers, edge appliances and 5G infrastructure," said Mollenkopf.
Expansion categories of late include:
When you zoom out on Qualcomm's fourth quarter results, it's clear that the company is positioning itself well to take on Nvidia, which may be distracted by its Arm purchase, as well as Intel, which is a cloud, IoT and edge compute leader. The 5G upgrade cycle is likely to be transformational for Qualcomm as it moves beyond its core handset business, which is beating its targets nicely.
November 05, 2020 at 06:00PM
https://ift.tt/3mZ1ObC
Qualcomm diversifies into IoT, auto and soon data center and edge computing - ZDNet
If you’re a professional in another field who’s interested in a career as a technologist, we have good news for you: It’s very possible to plunge into learning the technology specialization of your choice without any previous tech experience. For example, you might have a background as a marketer or political scientist, and realize you need to build up your programming or data-science skills to further your career—don’t be intimidated about jumping in.
The term for this is a "non-tech" or "non-computing" background. That means a working knowledge of tech but little working experience when it comes to programming, algorithms and data structures, according to Tiffani L. Williams, teaching professor and director of onramp programs at the University of Illinois at Urbana-Champaign.
Williams considers coding a key component of an introduction to technology learning. She says this skill “moves you from being a consumer of technology or consumer of computing to being a producer of computing.”
From Tax Consulting to Computing
Nicole Jackson, a student in the University of Illinois' iCAN program, has worked in tax consulting for more than 15 years. She also has a background in fitness and nutrition. At her last job in the tax field, she learned that data analytics would be an important skill to add to tax consulting, so she joined the iCAN program to explore this area (and also a potential career change). She'll be taking courses such as Fundamentals of Computer Science, Fundamentals of Algorithms, and Excursions in Computing.
Jackson is studying Python. She was unfamiliar with the language before taking her first iCAN class. “My biggest goal for myself right now is just to learn Python and feel comfortable that I can make decisions in creating coding that executes,” she said.
Meeting with clients over Zoom as the pandemic hit led Jackson to recognize the importance of technology skills, and that inspired conversations around data analysis and the safety of that data in her work.
“We wanted the clients to feel as comfortable as possible with what they were sharing if they didn’t have face-to-face contact,” Jackson said. “And that just made me more aware that technology is so critical to our job, but also how can we make our job easier and better. I think that really spurred an interest in exploring the iCAN program or just computer science in general.”
Becoming a Tech CEO Without Coding
Not everyone needs to be a programmer to be successful in technology, noted Sophia Matveeva, CEO and co-founder of retail tech company Enty, as well as host and founder of Tech for Non-Techies, an online learning and training company.
“The image of the programmer turned successful tech CEO is so prevalent, given the success of Mark Zuckerberg and Bill Gates, that many professionals are put off even entering the tech sector if they do not have a technical education,” Matveeva said.
Matveeva started Enty without a tech background. She formed Tech for Non-Techies to show that all employees, not only developers and data scientists, have value within tech companies. She started holding workshops on what people with a non-tech background need to know about tech. Demand grew for this knowledge, and Tech for Non-Techies became an official company in March. As part of the program, Matveeva teaches a class called “What Non-Technical Founders Really Need to Know About Tech.”
As an alternative to taking a coding class, Matveeva advises people who want to acquire tech skills to learn the process for developing apps, sites and algorithms. She recommends courses in user experience design and product management. Learning how a digital product is created is valuable experience for people coming from backgrounds other than tech.
“If your talent is in sales, marketing or strategy, or if you’re a lawyer working with a tech client, then learn how tech products are made and who does what on a product team,” Matveeva said. “This will be enough for you to become a useful co-creator.”
November 04, 2020 at 08:33PM
https://ift.tt/2GsLd00
How to Approach Technology From a Non-Computing Background - Dice Insights
"The quantum threat is basically going to destroy the security of networks as we know them today," declared Bruno Huttner, who directs strategic quantum initiatives for Geneva, Switzerland-based ID Quantique. No other commercial organization since the turn of the century has been more directly involved in the development of science and working theories for the future quantum computer network.
One class of theory involves cryptographic security. The moment a quantum computer (QC) breaks through the dam currently held in place by public-key cryptography (PKC), every message protected by that cryptography becomes vulnerable. That's Huttner's "quantum threat".
Bruno Huttner, Director of Strategic Quantum Initiatives, ID Quantique.
"A quantum-safe solution," he continued, speaking to the Inside Quantum Technology Europe 2020 conference in late October, "can come in two very different aspects. One is basically using classical [means] to address the quantum threat. The other is to fight quantum with quantum, and that's what we at ID Quantique are doing most of the time."
There is a movement called post-quantum cryptography (PQC), which aims to develop more robust classical means of securing encrypted communications that would hold up even once quantum attacks become practical. The other method, to which Huttner subscribes, seeks to protect all communications through quantum physics itself. Quantum key distribution (QKD) involves generating a cryptographic key over a quantum channel, for use in sending messages through a quantum information network (QIN).
Interfacing a QIN with an electronic internet, the way we think about such connections today, is physically impossible. Up until recently, it's been an open question whether any mechanism could be created, however fantastic or convoluted it may become, to exchange usable information between these two systems -- which, at the level of physics, reside on different planes of existence.
Could a quantum Internet connect non-quantum computers?
At IQT Europe, however, there were notes of hope.
Mathias Van Den Bossche, Director, Telecommunication and Navigation Systems R&D, Thales Alenia Space.
"I don't see why you would need a quantum computer," remarked Mathias Van Den Bossche, who directs research into telecommunications and navigation systems for orbital satellite components producer Thales Alenia Space, "to operate a quantum information network. Basically the tasks will be rather simple."
The implications of Van Den Bossche's remark, made during a presentation to IQT Europe, may not be self-evident today, although they certainly will be over the course of history. A quantum information network (QIN) is a theoretical concept enabling the intertwining of pairs of quantum computers (QC) as though they were physically joined to one another. The product of a QIN connection would be not so much an interfacing of two processors as a binding of two systems, whose resulting computational limit would be 2 to the power of the sum of their quantum components, or qubits. It would work, so long as our luck with leveraging quantum mechanics the way we've done so far continues to pan out in our favor.
Van Den Bossche's speculation is not meant to imply that quantum networking could be leveraged to bind together conventional, electronic computers in the same way -- for example, giving any two desktop computers as much combined memory as 2 to the power of the sum of their bytes. Quantum networks are only for quantum computers. But if he's correct, the problem of interfacing a classical computer to a QC's memory system, and communicating large quantities of data over such a system, may be solvable without additional quantum components, which would otherwise make each connected QC more volatile.
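A back-of-the-envelope reading of that claim (the arithmetic is mine, not Van Den Bossche's): two 53-qubit processors entangled across a QIN would jointly span 2^(53+53) = 2^106, roughly 8 × 10^31, computational amplitudes, whereas the same two machines kept separate address only about 2 × 2^53, or some 1.8 × 10^16. That multiplicative, rather than additive, scaling is what makes the binding of quantum systems so different from ordinary networking.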
Professor Stephanie Wehner, Roadmap Leader of the Quantum Internet and Networked Computing initiative at QuTech.
"Ultimately, in the future, we would like to make entanglement available for everyone," stated Prof. Stephanie Wehner of Delft University, who leads the Quantum Internet Initiative at the Dutch private/academic partnership QuTech. "This means enabling quantum communications ultimately between local quantum processors anywhere on Earth."
The principal use of a quantum Internet, perhaps permanently, would be to enable QKD to protect all communications. A quantum-encrypted message is protected by physics, not math, so it's not something that can be 'hacked'. Prof. Wehner foresees a time when QKD is applicable to every transaction with the public cloud.
"Here, you should be imagining you have a very simple quantum device -- a quantum terminal, if you wish," she explained, "and you use a quantum Internet to access a remote quantum computer in the cloud, [so] you can perform, for example, a simulation of a proprietary material in such a way that the cloud hosting provider who has the quantum computer cannot find out what your material design actually is."
No part of the cloud server could interfere with the simulation without wrecking it -- in the quantum lexicon, causing it to decohere. That might disrupt your work a bit, but it wouldn't give a malicious actor on the cloud anything useful whatsoever.
Achieving Prof. Wehner's vision of a fully realized quantum Internet would require a respectable number of hurdles to be overcome, and a number of lucky rolls of the dice to come up all boxcars. These good tidings include, but are not limited to, the following:
Classical control systems would need to marshal the exchanges of information to and from the QIN. This is the problem Van Den Bossche is hopeful can be solved: There needs to be some kind of functional waypoint between the two systems that cannot, in and of itself, introduce unreliability, uncertainty, and noise.
David Awschalom, Director, Chicago Quantum Exchange.
Quantum transducers, which would perform a role analogous to repeaters in an electronic network. (You may hear the phrase "quantum repeater" for this reason, although physicists say this is a misnomer.) As Prof. David Awschalom of the University of Chicago, director of the Chicago Quantum Exchange, asked IQT Europe attendees, "How do you convert light to matter efficiently in the quantum domain, and how do you build a quantum repeater?" Two qubits can share the curious virtue of entanglement when they're linked by optical fiber, but only over a limited distance. A transducer such as the one Prof. Awschalom described would handle the strange exchange of states required for entanglement to be effectively handed off, as if by a bucket brigade, enabling the QIN to be chained.
Single photon-emitting qubits, otherwise known as 'better qubits', would make the maintenance of a QIN coupled with classical equipment much more deterministic and manageable. Photons are the signals of a quantum network. A quantum memory system will require high frequencies and heart-stoppingly high bandwidth, which may only be feasible when photon sources can be observed and maintained with precision.
Quantum memory systems (see above) are, at least at present, ideal visions. For now, a high-qubit QC computing element serves as its own memory, and a 53-qubit node may store as much as 2^53 bits (roughly a petabyte; see the conversion note after this list), which may seem sufficient except that it's completely volatile. It may decohere completely when a calculation is completed, so some type of stable memory system will be required to maintain, say, a database. This is perhaps the tallest order of all.
Available fiber. The 5G Wireless deployment effort could be of assistance here, opening up avenues of connectivity for a photons-only network. Recent experiments conducted by Toshiba Research and the University of Cambridge have shown that telco fiber networks are reliable enough for quantum communications, in places where dark fiber has yet to be laid.
Lasers. Here is the forgotten element of this discussion. We're not talking about reclaimed laser units from unbuilt Blu-ray players, but as Awschalom describes them, "fast, high-power, milliwatt-scale pump lasers that generate high-bandwidth optical photons, to match the wavelengths of these memories."
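A quick check of the memory figure above (the conversion is mine, not the article's): 2^53 bits works out to 2^50 bytes, about 1.13 × 10^15 bytes, or a little over a petabyte. That is an enormous amount to hold in a register that can evaporate the moment the node decoheres, which is why stable quantum memory is the tallest order on this list.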
The current size and breadth of the quantum computing 'ecosystem', if we can call it that, may not yet mandate the investment of billions of dollars, or euros, into the establishment of all the new infrastructure this industry will require. But well before it gets there, we may encounter the point Huttner talks about, when the quantum threat is more imminent than the quantum bounty. Then, perhaps suddenly, investments may come in spades.
The events of 2020 have turned most predictions for 2021 on their head. Top trends such as artificial intelligence (AI) and the internet of things (IoT) will still define the ways in which tech reshapes our lives in the next year. However, the most significant use cases now involve helping us to adapt and survive in the changing times we are living through.
The 5 Biggest Cloud Computing Trends In 2021
No trend is more relevant to this than cloud computing. Cloud is the backbone of the data-driven, app-based tech ecosystem that has been vital in helping us manage this change. Everything from contact tracing to home delivery services, remote medicine, and working (and playing) from home has been revolutionized by cloud services.
Throughout 2021, we can expect to see the rate of this change accelerate as more businesses get to grips with adopting cloud models, and delivery of data from the cloud to our devices becomes more integral to our daily lives. Here are some of the ways in which I can see this playing out over the course of 2021:
1. Multi-cloud approaches will lead to a breakdown of barriers between providers
Currently, the big public cloud providers - Amazon, Microsoft, Google, and so on – take something of a walled garden approach to the services they provide. And why not? Their business model has involved promoting their platforms as one-stop-shops, covering all of an organization's cloud, data, and compute requirements. In practice, however, industry is increasingly turning to hybrid or multi-cloud environments (see below), with requirements for infrastructure to be deployed across multiple models.
What this means is that there are growing calls for the big providers to create bridges between their platforms. This runs contrary to their business models, which are reliant on an ability to upsell greater cloud capacity as well as additional services as their customers scale. Adopting a more collaborative approach doesn't just enable customers to take greater advantage of the fast-growing multi-cloud trend, though. It will also benefit organizations needing to share data and access with partners in their supply chain, which may all be working across diverse applications and data standards. This is also a space where we are likely to see growing levels of innovation from startups, creating services that simplify the process of operating between different public cloud platforms.
2. AI will improve the efficiency and speed of cloud computing
As far as cloud goes, AI is a key enabler of several ways in which we can expect technology to adapt to our needs throughout 2021. Cloud-based as-a-service platforms enable users on just about any budget and with any level of skill to access machine learning functions such as image recognition tools, language processing, and recommendation engines. Cloud will continue to allow these revolutionary toolsets to become more widely deployed by enterprises of all sizes and in all fields, leading to increased productivity and efficiency.
Autonomous vehicles, smart city infrastructure, and pandemic response planning are all fields of research where the effects of smarter algorithms delivered through cloud services will be felt. Machine learning also plays a big part in the logistics processes that keep cloud data centers up and running. Cooling systems, networks of hardware, and power usage in these delicate and expensive environments can all be monitored and managed by AI algorithms in order to optimize running efficiency and minimize their impact on the environment. Research and development in this field are likely to continue to lead to new breakthroughs in data center speed and efficiency.
3. Gaming will be increasingly delivered from the cloud, just like music and movies
Amazon most recently joined the ranks of tech giants and startups offering their own platform for cloud gaming. Just as with music and video streaming before it, cloud gaming promises to revolutionize the way we consume entertainment media by offering instant access to vast libraries of games that can be played for a monthly subscription. During 2020, services were launched by Google, Microsoft, and Nvidia, while Sony's has been available for several years now. Even though new Xbox and Playstation consoles are being developed, costing around $500, industry experts are predicting that the days when we need to spend hundreds on new hardware every few years to stay at the cutting edge of gaming may be drawing to a close, thanks to the coming-of-age of cloud gaming.
4. Hybrid and on-premise cloud solutions grow in popularity
Choosing between a public, private, or hybrid cloud environment has proved challenging for some organizations. Each route offers advantages and disadvantages when it comes to flexibility, performance, security, and compliance. But as cloud ecosystems have matured, many have found there's no magic one-size-fits-all solution on the shelves. Hybrid or multi-cloud environments, where users pick and choose the individual elements of service providers' offerings that suit their needs, have grown in popularity, leading to a situation where those providers have begun to reassess their models of delivery.
Amazon and Google, for example, have traditionally been market leaders that relied on selling their customers space on their public cloud platforms, whereas Microsoft and IBM have been more flexible with enabling users to deploy their cloud tools and technologies across their existing, on-premises networks. Now it seems that these providers have woken up to the need for different platforms and approaches within organizations – perhaps utilizing public cloud to provide content delivery while storing and processing customer data and other controlled information via private or on-premise solutions. There will also be a growing demand for “bare metal” cloud space – raw storage and compute power where businesses can simply “lift and shift” their existing systems into the cloud without having to adapt them to run on pre-installed software or services. The need to consolidate these user requirements will be a driving force behind the direction in which cloud services evolve throughout 2021.
5. More of us will be working on Virtual Cloud Desktops
This is basically where the entire environment of our workstation is delivered as a managed cloud service to our laptop or desktop screen where we work. This means that organizations can take advantage of by-the-hour subscriptions for the time their employees spend working at their machines, eliminating the cost of hardware updates and the need to dispose of redundant technology.
Sometimes known as desktop-as-a-service, this model of computing is offered by Amazon via the Workspaces platform and Microsoft with Windows Virtual Desktop. Google also offers the functionality through its Chromebook devices. In practice, this can increase efficiency across a workforce by ensuring everyone is using up-to-date, synchronized technology. It also benefits security as all devices can be managed in a centralized way, rather than having to make sure everyone on the network is following best practice. When people join or leave a company, the cost simply scales up as the number of hours spent using the platform increases or decreases. This flexible functionality means virtual desktop services are likely to become increasingly popular in the coming years.
November 02, 2020 at 10:23PM
https://ift.tt/3jRBA90
This bundle quickly catches you up to speed on everything Azure with six eBooks and more than 15 hours of video instruction. That means you learn Azure's fundamentals, discover how to work with its advanced features and understand how to help companies, both small and large, make the most of this cloud computing platform. Throughout the bundle, you get guidance on building Azure solutions, learn how to use PowerShell to initiate and execute daily Azure tasks, implement DevOps with Azure and more.
Included materials:
Implementing Azure Solutions [eBook] ($40 value)
Serverless Integration Design Patterns with Azure [eBook] ($40 value)
Migrating Applications to the Cloud with Azure [eBook] ($28 value)
Mastering Identity & Access Management with Microsoft Azure [eBook] ($48 value)
DevOps with Azure [Video] ($125 value)
Modernize Node.js Web Apps with Azure App Service [Video] ($125 value)
Azure Platform as a Service, Web & API Application Deployment [Video] ($125 value)
Azure Cognitive Services for Developers [Video] ($125 value)
Cloud computing drives innovation and business around the world. Learning how to work with a leading cloud solution like Azure gives you tangible skills companies need today. Normally $703, The Complete Microsoft Azure eBook & Video Course Bundle is on sale for $30, 95% off its original price tag.
Prices are subject to change.
Engadget is teaming up with StackSocial to bring you deals on the latest headphones, gadgets, tech toys, and tutorials. This post does not constitute editorial endorsement, and we earn a portion of all sales. If you have any questions about the products you see here or previous purchases, please contact StackSocial support here.
November 02, 2020 at 02:00AM
https://ift.tt/326uoj3
This Microsoft Azure training is just $30 today - Engadget
SimpleMachines' computer chip may not look all that different from what already exists in many computers, but the company's founder said the technology has the potential to speed up advancements in software development.
A Madison-based startup says its new technology could revolutionize computer programming and advance development of artificial intelligence and machine learning.
Founded in 2017 by UW-Madison researcher Karu Sankaralingam, SimpleMachines Inc. is ready to launch a new type of computer chip that Sankaralingam said is faster and more powerful than currently available chips while using less electricity.
Launching sales early next year with about 300 manufactured chips — called Mozart — SimpleMachines will begin working with customers, primarily companies with large data centers such as banks, to adopt the new chip.
Sankaralingam said the chip has the potential to speed up development of artificial intelligence and machine learning, which has been hindered by hardware that can’t keep up.
“These things are changing very, very fast, and having a hardware solution that provides high performance and still supports that pace of evolution is very important,” Sankaralingam said.
Chips are at the core of what any computer does. Electronics from calculators to cell phones to cloud-computing servers are able to function because of the chips they use. Many chips are built for specific purposes — such as those in calculators or cameras — while others — such as cell phones and personal computers — are built to run many applications.
SimpleMachines' chips have the potential to replace most other chips, Sankaralingam said, because they are more powerful and can be reprogrammed for new uses.
Sankaralingam launched SimpleMachines as computer chip development hit a roadblock — it was becoming harder to make better chips at a cheaper rate that were also energy efficient. While that was becoming more difficult, artificial intelligence and machine learning were advancing at rapid speed.
Artificial intelligence and machine learning need to process mass amounts of data and run many programs at one time, but Sankaralingam said many computer chips aren’t up to the task, either because they aren’t powerful enough or they use too much electricity to be cost efficient.
“That was really an opportunity for us,” Sankaralingam said. “We can strike when it’s really hot right now.”
The SimpleMachines chip addresses all those problems, Sankaralingam said. A single chip can run more programs at the same time and process data faster than other chips while also using less energy.
Though they can be built to run on less power, SimpleMachines designed this first chip to run on 75 Watts, which is the standard for most of the current machines processing massive amounts of data.
Into the cloud
The chips aren’t likely to find their way into your home computer anytime soon. Instead, SimpleMachines hopes to sell the chips to the companies doing the cloud computing that supports many aspects of online life, such as image recognition or video recommendations.
Every year, as new tech comes along, electronic devices become obsolete because the devices’ chips were developed for only one task or application. The speed that software programs evolve far outpaces the speed of hardware development.
“These applications are changing every six months, but it takes two to three years to build a chip,” Sankaralingam said. “It’s like, ‘I want to build something to do this,’ but one year later that thing is not important anymore. The chip you built is kind of useless.”
50 employees
Take GPS systems such as Garmin devices, for example. Those systems were a revolution for navigating in a car, but now, smartphones can run a GPS app well enough for most drivers.
Those years-old GPS devices are no longer useful because the chip inside can only run mapping and navigational software. But if a chip like the ones made by SimpleMachines existed, Sankaralingam said it could be reprogrammed with a software update to do other things, like become a screen for backseat passengers to watch movies.
In the three years since the company’s founding, it has grown to about 50 employees, many of whom are based in Madison or California. Among the ranks, Sankaralingam said, are a few engineers who formerly worked for computer-chip giants Qualcomm and Intel.
SimpleMachines will still have to compete with those computer-chip giants, but Sankaralingam is optimistic businesses will be willing to adopt this new chip.
No. 8: Michael Norregaard, Sonic Foundry, $262,746
Base salary: $250,916
All other compensation: $11,848
Norregaard took over as CEO when Gary R. Weis retired in May 2019.
Weis' base salary: $402,343
All other compensation: $5,169
Weis' total compensation from Sonic Foundry in 2019: $407,512
Compensation data was collected from SEC filing 10-K/A submitted by the company in January 2020.
No. 7: Stephen L. Schlecht, Duluth Holdings, $316,211
Base salary: $316,211
No other compensation for Schlecht from Duluth Holdings is listed on SEC filing DEF 14A, submitted by the company in April. Schlecht is listed as the chairman and founder of Duluth Holdings on the company website. He took over as CEO when Stephanie L. Pugliese resigned in August 2019.
Pugliese's base salary: $450,795
Stock/option awards: $975,005
Pugliese's total compensation from Duluth Holdings in 2019: $1,425,800
No. 6: Corey A. Chambas, First Business Financial Services, $1,231,746
Base salary: $466,000
Stock/option awards: $276,879
All other compensation: $488,867
Compensation data was collected from SEC filing DEF 14A submitted by the company in March 2020.
No. 5: Jeffrey M. Keebler, MGE Energy, $2,381,012
Base salary: $591,667
Stock/option awards: $345,015
Bonus pay: $412,425
All other compensation: $1,031,905
Compensation data was collected from SEC filing DEF 14A submitted by the company in March 2020.
No. 4: Jerome Griffith, Lands' End, $5,414,578
Base salary: $1,019,231
Stock/option awards: $2,309,982
All other compensation: $2,085,365
Compensation data was collected from SEC filing DEF 14A submitted by the company in March 2020.
No. 3: John O. Larsen, Alliant Energy Corp, $7,619,999
Base salary: $754,615
Stock/option awards: $2,473,403
All other compensation: $4,391,981
Larsen took over as CEO for Alliant Energy Corp when Patricia L. Kampling retired from the company in July 2019.
Kampling's base salary: $643,468
Stock/option awards: $3,464,092
All other compensation: $1,400,554
Kampling's total compensation from Alliant Energy Corp in 2019: $5,508,114
Compensation data was collected from SEC filing DEF 14A submitted by the company in April 2020.
No. 2: Kevin T. Conroy, Exact Sciences Corp, $18,716,543
Base salary: $792,169
Stock/option awards: $16,898,172
Bonus pay: $1,000,250
All other compensation: $25,952
Compensation data was collected from SEC filing DEF 14A submitted by the company in April 2020.
No. 1: David M. Maura, Spectrum Brands Holdings, $19,688,122
Base salary: $900,000
Stock/option awards: $12,309,411
All other compensation: $6,478,711
Compensation data was collected from SEC filing 10-K/A submitted by the company in January 2020.
October 31, 2020 at 08:30PM
https://ift.tt/3ehaMOi
Madison-based startup looking to revolutionize computing with a new type of computer chip - Madison.com
Nature talks to Peter Shor 25 years after he showed how to make quantum computations feasible — and how they could endanger our data.
Davide Castelvecchi
Applied mathematician Peter Shor worked out how to overcome a major problem in quantum computing. Credit: BBVA FOUNDATION
When physicists first thought up quantum computers in the 1980s, they sounded like a nice theoretical idea, but one probably destined to remain on paper. Then in 1995, 25 years ago this month, applied mathematician Peter Shor published a paper [1] that changed that perception.
Shor’s paper showed how quantum computers could overcome a crucial problem. The machines would process information as qubits — quantum versions of ordinary bits that can simultaneously be ‘0’ and ‘1’. But quantum states are notoriously vulnerable to noise, leading to loss of information. His error-correction technique — which detects errors caused by noise — showed how to make quantum information more robust.
Shor, who is now at the Massachusetts Institute of Technology in Cambridge and is also a published poet, had shocked the physics and computer-science worlds the previous year, when he found [2] the first potentially useful — but ominous — way to use a hypothetical quantum computer. He'd written an algorithm that would allow a quantum computer to factor integers into prime factors at lightning speed. Most Internet traffic today is secured by encryption techniques based on large prime numbers. Cracking those codes is hard because classical computers are slow at factoring large products.
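To see exactly where the quantum speedup enters, here is a short Python sketch of the number-theoretic skeleton behind Shor's algorithm. The order-finding step is done here by brute force, which is precisely the part a quantum computer performs exponentially faster, and the part that becomes hopeless classically at the key sizes used on the Internet:

```python
# Classical skeleton of Shor's factoring method. A quantum computer replaces
# find_order() with a fast period-finding subroutine; everything else is classical.
import math
import random

def find_order(a, N):
    """Smallest r > 0 with a**r = 1 (mod N) -- the step Shor's algorithm accelerates."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_factor(N):
    """Nontrivial factor of N (assumed odd, composite and not a prime power)."""
    while True:
        a = random.randrange(2, N)
        g = math.gcd(a, N)
        if g > 1:
            return g                      # the random guess already shares a factor
        r = find_order(a, N)
        if r % 2:
            continue                      # need an even order
        y = pow(a, r // 2, N)
        if y == N - 1:
            continue                      # trivial square root of 1; try another a
        return math.gcd(y - 1, N)         # nontrivial factor with high probability

print(shor_factor(15))      # 3 or 5
print(shor_factor(3127))    # 53 or 59 (3127 = 53 * 59)
```

For a 2048-bit RSA modulus, the brute-force loop above would run for longer than the age of the universe; the quantum Fourier transform finds the same period in polynomial time, which is why Shor's result alarmed cryptographers.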
Quantum computers are now a reality, although they are still too rudimentary to factor numbers of more than two digits. But it is only a matter of time until quantum computers threaten Internet encryption.
Nature caught up with Shor to ask him about the impact of his work — and where Internet security is heading.
Before your factoring algorithm, were quantum computers mostly a theoretical curiosity?
My paper certainly gave people an idea that these machines could do something useful. Computer scientist Daniel Simon, in a precursor of my result, solved a problem that he came up with that shows that quantum computers are exponentially faster [than ordinary computers]. But even after Simon’s algorithm, it wasn’t clear that they could do something useful.
What was the reaction to your announcement of the factoring algorithm?
At first, I had only an intermediate result. I gave a talk about it at Bell Labs [in New Providence, New Jersey, where I was working at the time] on a Tuesday in April 1994. The news spread amazingly fast, and that weekend, computer scientist Umesh Vazirani called me. He said, “I hear you can factor on a quantum computer, tell me how it works.” At that point, I had not actually solved the factoring problem. I don’t know if you know the children’s game ‘telephone’, but somehow in five days, my result had turned into factoring as people were telling each other about it. And in those five days, I had solved factoring as well, so I could tell Umesh how to do it.
All sorts of people were asking me for my paper before I had even finished writing it, so I had to send them an incomplete draft.
But many experts still thought that quantum computers would lose information before you can actually finish your computation?
One of the objections was that in quantum mechanics, if you measure a system, you inevitably disturb it. I showed how to measure the error without measuring the computation — and then you can correct the error and not destroy the computation.
After my 1995 paper on error correction, some of the sceptics were convinced that maybe quantum computing might be doable.
Error correction relies on ‘physical’ and ‘logical’ qubits. What is the difference?
When you write down an algorithm for a quantum computer, you assume that the qubits [the quantum version of a classical bit of information] are noiseless; these noiseless qubits that are described by the algorithm are the logical qubits. We actually don’t have noiseless qubits in our quantum computers, and in fact, if we try to run our algorithm without any kind of noise reduction, an error will almost inevitably occur.
A physical qubit is one of the noisy qubits in our quantum computer. To run our algorithm without making any errors, we need to use the physical qubits to encode logical qubits, using a quantum error-correcting code. The best way we know how to do this has a fairly large overhead, requiring many physical qubits for each logical qubit.
It is quite complicated to work out how many more qubits are needed for the technique. If you want to build a quantum computer using surface code — the best candidate right now — for every logical qubit, you need about 100 physical qubits, maybe more.
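The surface code itself is too intricate to sketch here, but the much simpler three-qubit bit-flip code, simulated below as a toy (my illustration, not Shor's original construction), captures both points at once: three physical qubits carry one logical qubit, and parity checks diagnose an error without ever reading out the encoded amplitudes:

```python
# Three-qubit bit-flip code: "measure the error without measuring the computation".
# One logical qubit a|0> + b|1> is stored as a|000> + b|111>; parity checks locate
# a single bit-flip error while revealing nothing about the amplitudes a and b.
import random

def encode(alpha, beta):
    return {"000": alpha, "111": beta}            # state as {basis string: amplitude}

def bit_flip(state, i):
    # an X error on physical qubit i flips that bit in every basis string
    return {s[:i] + ("1" if s[i] == "0" else "0") + s[i + 1:]: amp
            for s, amp in state.items()}

def syndrome(state):
    # parities of qubits (0,1) and (1,2); identical for every basis string present,
    # so reading them collapses nothing about the encoded amplitudes
    s = next(iter(state))
    return (int(s[0]) ^ int(s[1]), int(s[1]) ^ int(s[2]))

CORRECTIONS = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}

alpha, beta = 0.6, 0.8                            # an arbitrary logical state
state = encode(alpha, beta)
state = bit_flip(state, random.randrange(3))      # noise flips one random physical qubit
error_location = CORRECTIONS[syndrome(state)]     # diagnose the error from parities alone
if error_location is not None:
    state = bit_flip(state, error_location)       # undo it
assert state == encode(alpha, beta)               # the logical qubit survives intact
```

Real codes such as the surface code apply the same principle to both bit-flip and phase errors, which is where the hundred-to-one overhead Shor mentions comes from.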
In 2019, Google showed that its 54-qubit quantum computer could solve a problem that would take impossibly long on a classical computer — the first demonstration of a ‘quantum advantage’. What was your reaction?
It’s definitely a milestone. It shows that quantum computers can do things better than classical computers — at least, for a very contrived problem. Certainly some publicity was involved on Google’s part. But also they have a very impressive quantum computer. It still needs to be a lot better before it can do anything interesting. There’s also the startup IonQ. It looks like they can build a quantum computer that in some sense is better than Google’s or IBM’s.
When quantum computers can factor the products of large prime numbers, that will enable them to break ‘RSA’ — the ubiquitous Internet encryption system.
Yes, but the first people who break RSA either are going to be NSA [the US National Security Agency] or some other big organization. At first, these computers will be slow. If you have a computer that can only break, say, one RSA key per hour, anything that’s not a high priority or a national-security risk is not going to be broken. The NSA has much more important things to use their quantum computer on than reading your e-mail — they’ll be reading the Chinese ambassador’s e-mail.
Are there cryptography systems that can replace RSA and that will be secure even in the age of quantum computers — the ‘post-quantum encryption’?
I think we have post-quantum cryptosystems that you could replace RSA with. RSA is not the big problem right now. The big problem is that there are other ways to break Internet security, such as badly programmed software, viruses, sending information to some not entirely honest player. I think the only obstruction to replacing RSA with a secure post-quantum cryptosystem will be will-power and programming time. I think it’s something we know how to do; it’s just not clear that we’ll do it in time.
Is there a risk we’ll be caught unprepared?
Yes. There was an enormous amount of effort put into fixing the Year 2000 bug. You’ll need an enormous amount of effort to switch to post-quantum. If we wait around too long, it will be too late.
This interview has been edited for length and clarity.
References
1. Shor, P. W. Phys. Rev. A 52, R2493(R) (1995).
2. Shor, P. W. Proc. 35th Annual Symp. Found. Comp. Sci. 124–134 (1994).