Tuesday, May 29, 2018

Machine Learning Future Trends and the AI Doomsday Takeover

Machine Learning Introduction

What is machine learning?

Machine learning is an extensive subfield of artificial intelligence that studies methods of constructing algorithms capable of learning. There are two types of learning. Inductive learning, or learning from precedents (case studies), is based on identifying patterns in empirical data. Deductive learning involves formalizing the knowledge of experts and transferring it to the computer as a knowledge base. Deductive learning is usually assigned to the field of expert systems, so the terms "machine learning" and "learning from precedents" can be considered synonymous. Many methods of inductive learning were developed as an alternative to classical statistical approaches.

Artificial Intelligence Creates Doom Levels No Worse Than Humans

Can a modern three-dimensional shooter have an infinite number of different levels? It can, if you train artificial intelligence to create them. That is exactly what researchers from the Polytechnic University of Milan have been doing: their algorithms are trained on the famous game Doom.

How Frightened Should We Be of AI?


The three-dimensional shooter Doom appeared 25 years ago thanks to the talented programmer John Carmack. It lingered on the hard drives of personal computers for a long time thanks to the efforts of John Romero and American McGee, who created the game's levels. In addition, id Software released a level editor that allowed players to extend the game for free.

The game's continued popularity and the huge number of levels created by real people made Doom ideal for training artificial intelligence. The researchers from the University of Milan deserve credit, though, for the very interesting approach they applied to the task.

They built a generative adversarial network: two algorithms studied thousands of Doom levels created over the game's entire existence. One of them then began composing its own levels, while the second compared human-made levels with those created by artificial intelligence. If the second algorithm could not distinguish a level generated by the first from levels created by people, that level was considered fit for play.
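The adversarial setup described above can be sketched at toy scale. Below is a minimal, illustrative GAN in which "levels" are just numbers: a linear generator learns to produce samples that a logistic discriminator cannot tell apart from samples of a real distribution. All numbers and parameter choices here are invented for the demo; this is not the Milan group's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# "Real levels" in this toy are numbers drawn from N(4, 1).
def real_batch(n):
    return rng.normal(4.0, 1.0, n)

# Linear generator G(z) = wg*z + bg and logistic discriminator D(x).
wg, bg = 0.1, 0.0
wd, bd = 0.0, 0.0
lr, steps, batch = 0.02, 5000, 64

for _ in range(steps):
    z = rng.normal(0.0, 1.0, batch)
    fake = wg * z + bg
    real = real_batch(batch)
    d_real = sigmoid(wd * real + bd)
    d_fake = sigmoid(wd * fake + bd)

    # Discriminator: gradient ascent on log D(real) + log(1 - D(fake)).
    wd += lr * np.mean((1 - d_real) * real - d_fake * fake)
    bd += lr * np.mean((1 - d_real) - d_fake)

    # Generator: gradient ascent on log D(fake) (non-saturating objective).
    grad = (1 - sigmoid(wd * fake + bd)) * wd
    wg += lr * np.mean(grad * z)
    bg += lr * np.mean(grad)

# After training, generated samples should cluster near the real mean (4.0),
# i.e. the discriminator can no longer reliably tell them apart.
fake_mean = float(np.mean(wg * rng.normal(0.0, 1.0, 1000) + bg))
print(round(fake_mean, 2))
```

The same two-player game, scaled up to convolutional networks over level images, is what makes the "indistinguishable from human-made" acceptance test possible.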

AI Takeover: The Future of Machine Learning

Of course, very few people play Doom now, but this approach can be used for any modern game. One only has to train the artificial intelligence well, and then level designers like Romero and McGee will no longer have work.

Saturday, May 19, 2018

Current Trend in Artificial Intelligence | Machine Learning Future Trends

Artificial Intelligence 2018. Are You Ready for AI? See What's Happening in 2018 and 2019

2045 is the predicted year of the invention of a full-fledged artificial intelligence that imitates the human mind.
An inexperienced Internet user will be surprised to learn that artificial intelligence (AI) is still a technology of the future, so firmly has the concept of AI entered the information agenda. Almost every day the largest technology corporations report on the achievements of their own "intellects": an "intellect" can already process photos, play against you in a game (in Go, by the way, AI has already beaten all the champions), chat on Twitter, or respond to a voice request to turn off the lights or turn on the music. In fact, however, developers have so far made progress only in what the National Science and Technology Council (NSTC) defines as "narrow" AI. "General" AI, a full-fledged imitation of human thinking, has yet to be invented, and it is impossible to predict how long that will take.

Recall the plot of the film A.I. Artificial Intelligence: an advanced prototype robot child named David is programmed to show unconditional love, and when his human family abandons him, he embarks on a dangerous quest to become a real boy.

Artificial Intelligence in Everyday Life | AI Revolution

The research process, and the emerging market behind it, is now driven primarily by machine learning. This subfield of AI works through algorithms of artificial neural networks. Neural networks function on a principle similar to the human brain: they draw conclusions from the analysis of large data sets. For example, in December 2016 a group of researchers from the Massachusetts Institute of Technology (MIT) taught artificial intelligence to turn static images into dynamic ones. To do this, the scientists "fed" the AI two million videos with a total duration of about a year and programmed it to make predictions from a static image. When the MIT "intellect" received a photo of a beach, it "enlivened" it with the motion of sea waves; from an image of a railway station, the AI "directed" a short film (so far only 32 seconds long) about a departing train. The technology has practical applications, for example in driverless vehicles, which must be able to decide instantly between a sharp maneuver and staying the course when an obstacle appears on the road, in order to avoid tragic consequences (today up to 90% of accidents are caused by driver error).
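The "learn patterns from data" principle above can be shown at toy scale. The sketch below, an illustration rather than anything resembling MIT's video model, trains a tiny two-layer neural network by backpropagation to learn XOR from four examples, a pattern no single linear rule can capture.

```python
import numpy as np

rng = np.random.default_rng(1)

# Four XOR examples: output is 1 when exactly one input is 1.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Small two-layer network: 2 inputs -> 8 tanh units -> 1 sigmoid output.
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lr = 0.1
for _ in range(10000):
    h = np.tanh(X @ W1 + b1)        # hidden activations
    out = sigmoid(h @ W2 + b2)      # predicted probability of class 1
    d_out = out - y                 # cross-entropy gradient w.r.t. pre-activation
    dW2 = h.T @ d_out
    db2 = d_out.sum(axis=0)
    d_h = (d_out @ W2.T) * (1 - h ** 2)   # backprop through tanh
    dW1 = X.T @ d_h
    db1 = d_h.sum(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

pred = (sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)
print(pred.ravel().tolist())
```

Video-prediction networks differ only in scale: millions of frames instead of four rows, and convolutional layers instead of one hidden layer, but the same gradient-driven pattern extraction.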

Machine Learning Future Trends

Google trains its developments in a similar way. The corporation (with the support of Oxford University) has, for example, taught AI to read lips more effectively than professional linguists: after studying 5,000 hours of video, the "intellect" recognized 47% of the words in a random sample of 200 fragments, while a human managed only 12%. The company has also invited users, in an entertaining form, to help AI recognize images better: in November 2016 it launched the game experiment Quick, Draw!, in which the AI must guess within 20 seconds what a person is drawing on the screen.

Many companies already use image recognition for commercial purposes as well. The Russian startup Prisma, for example, trained its neural network to process users' photos in the styles of different artists. The result was a service downloaded, according to the TechCrunch portal, more than 70 million times around the world and recognized by Apple as the "application of the year" for the iPhone.

By 2020 the AI market will grow tens of times over and reach $153 billion, predict analysts at Bank of America Merrill Lynch. By their calculations, most of the market, more than $80 billion in monetary terms, will be captured by developers of solutions for robotics. Beyond the drone route planning already mentioned, the technology will be needed, for example, to improve the "smart home" concept and to develop the logistics of commercial drones (in December Amazon already made its first commercial drone delivery in Britain).

$153 billion: the value the artificial intelligence market will reach by 2020

The most ambitious players in the technology industry are eyeing the AI market and AI business ideas. In the spring of 2016, Tesla and SpaceX founder Elon Musk, together with partners, created the non-profit company OpenAI to develop "friendly" AI. The entrepreneurs set themselves the task not only of saving humanity from enslavement by machines, but also of making the technology accessible to all developers. OpenAI will receive $1 billion in investment. The company has already introduced its debut product: software for reinforcement learning. This type of machine learning lets an agent develop by interacting independently with its environment, allowing AI, for example, to control robots or play board games. Sometimes this leads to incidents: Microsoft's Twitter bot, sent in May to learn "with reinforcement" on the microblogging network, quickly picked up offensive speech from users and had to be shut down.

Other companies have followed in OpenAI's footsteps. In December 2016, Apple announced plans to publish its AI work openly. Earlier, Microsoft, IBM, Amazon, Google, and Facebook announced that they would join forces to pool their research "capacities." The world's largest social network is particularly interested in the rapid development of AI technologies to combat fake news in users' feeds. Toward the end of the US presidential campaign, false stories became a serious problem for the service: from February to November, users "liked" or reposted the 20 most popular "fakes" 8.7 million times, while the top 20 real news stories drew only 7.4 million reactions.

Machine-learning algorithms are used in almost all of today's "fashionable" research directions, from driverless vehicles to smart-home systems. AI technologies can potentially change any branch of the economy and almost any business. In the near term of 2017-2018, according to the forecast of the analytical company McKinsey, machine learning will transform the recruitment market (artificial intelligence will find optimal candidates for employers more accurately than professional headhunters) and some segments of the IT market (for example, the rapid development of chatbots lets businesses build new communication strategies, including in social media).

In the future, AI should help states and businesses deal with cyberthreats. AI² (AI-Squared), a joint project of the Massachusetts Institute of Technology (MIT) and the startup PatternEx, is a platform that processes large amounts of user data with a neural-network algorithm to detect cyberattacks. Based on a year and a half of platform tests (during which it analyzed 3.6 billion files), its developers reported detecting and preventing 85% of attacks.

76 million: the number of jobs in the US that artificial intelligence could destroy over the next twenty years



Neural networks have long been used in cybersecurity technologies.

For example, about a decade ago Kaspersky Lab created a special module capable of recognizing digital passwords rendered as pictures, which malicious programs used to distribute by e-mail. Other examples are image-analysis technologies on websites for parental-control filtering, and the adjacent technology of recognizing spam inside images. Neural networks are also widely used to analyze network traffic: with their help, cybersecurity specialists look for anomalies that may indicate suspicious activity.
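As a hedged illustration of the traffic-anomaly idea: real products apply neural networks over many features, but a simple z-score test on a single invented feature (requests per minute) shows the underlying principle of flagging deviations from a learned baseline.

```python
from statistics import mean, stdev

def find_anomalies(baseline, observed, threshold=3.0):
    """Return observations more than `threshold` standard deviations
    from the mean of the baseline measurements."""
    mu, sigma = mean(baseline), stdev(baseline)
    return [x for x in observed if abs(x - mu) > threshold * sigma]

# Baseline: requests per minute during normal operation (illustrative numbers).
baseline = [100, 98, 103, 97, 101, 99, 102, 100, 96, 104]
observed = [101, 99, 250, 98]   # 250 looks like a burst, e.g. a DDoS spike
print(find_anomalies(baseline, observed))
```

A specialist would then inspect the flagged minutes rather than the whole traffic log, which is exactly the time-saving the paragraph describes.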

But apart from its positive consequences, the development of artificial intelligence also carries risks. In the coming years AI technologies will continue to displace workers, says a report by the analytical company ArkInvest. According to its experts, over the next twenty years artificial intelligence could destroy as many as 76 million jobs in the US, ten times more than were created during Barack Obama's presidency. And over a thirty-year horizon, AI could push the global unemployment rate (with the current structure of the labor market) to 50%, The Guardian wrote, citing Rice University professor Moshe Vardi.


Thursday, May 17, 2018

Business Survival with AI and BI in Digital Data Distribution

Distribution of information to users of BI 2018

Modern business dynamics and data trends, driven by emerging Internet technologies, have led to a significant increase in the amount of data stored in corporate systems.

Many enterprises face the problem of excess information, as big-data researchers (including those cited by Forbes) have predicted. Transaction-processing systems generate daily reports. Data warehouses and analytical tools carve information into longitudinal and lateral slices ("slice and dice") in hundreds of ways. Enterprise Resource Planning (ERP) tools collect all of a company's data. Performance-assessment tools and information arriving via the Internet further increase the congestion.
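The longitudinal and lateral slicing mentioned above can be sketched in a few lines. This is an illustrative "slice and dice" over hypothetical flat transaction records, not any particular vendor's tool: a slice fixes one dimension, a dice groups and aggregates along another.

```python
from collections import defaultdict

# Hypothetical transaction records, the flat form a warehouse might expose.
sales = [
    {"region": "EU", "quarter": "Q1", "product": "A", "amount": 120},
    {"region": "EU", "quarter": "Q2", "product": "A", "amount": 140},
    {"region": "US", "quarter": "Q1", "product": "A", "amount": 200},
    {"region": "US", "quarter": "Q1", "product": "B", "amount": 80},
    {"region": "US", "quarter": "Q2", "product": "B", "amount": 90},
]

def slice_by(records, **fixed):
    """Keep only records matching the fixed dimension values (a 'slice')."""
    return [r for r in records if all(r[k] == v for k, v in fixed.items())]

def dice(records, dim):
    """Aggregate amounts along one dimension (a 'dice' / roll-up)."""
    totals = defaultdict(int)
    for r in records:
        totals[r[dim]] += r["amount"]
    return dict(totals)

q1 = slice_by(sales, quarter="Q1")
print(dice(q1, "region"))   # regional totals within the Q1 slice
```

OLAP engines do the same thing over billions of rows with precomputed cubes; the operations themselves are no more exotic than this filter-and-group.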

BI 2018

The situation is further complicated by the fact that modern enterprises operate in a global market, which means their data and applications are geographically distributed: each branch has its own applications and its own repository. Resource-planning tools are implemented at all levels as applications for human-resource management, financial systems, and production-oriented software are created. The information these applications use is stored in disparate sources: different relational databases, special-format files, and multidimensional datasets.

BIG Data BI and AI

Today many companies lose money and time distributing information resources manually. According to some studies, about 15% of revenue is spent on average on creating and distributing paper documents. According to the Gartner Group, corporate employees spend 10% of their time searching for information needed for a given task or decision, 20% to 40% processing already prepared documents, and 30% on document-related tasks that add no value to the company's final service or product.

A new concept called "information broadcasting" is a technology that makes it possible to transmit personalized messages supporting decision making via the Web, e-mail, fax, pager, or mobile phone to hundreds of thousands of recipients.

Automating the process of sending information can provide users with timely data, expand the company's capabilities and prevent the accumulation of unnecessary data.

Moreover, the intellectual resources of the enterprise, created with BI tools and analytical packages, should also be transferred automatically to the applications used in operational departments to support decision making.

The most important factors for success

The task of delivering important business information to the right consumer using AI and business-intelligence technologies is clearly not an easy one. At the same time, a number of factors determine whether it is solved successfully:

  • Automatic saving of information in various working formats (HTML, PDF, Excel, etc.), to satisfy the needs of specific users and allow analysis.
  • Distribution of the significant material needed for decision making, including "third-party" materials (drawings, Word documents, PowerPoint presentations, and reports created both with software purchased from various BI suppliers and with legacy reporting applications), and delivery of information via various channels such as e-mail, the Web, fax, and network printers, as well as transfer to mobile devices.
  • Intelligent dissemination functions, such as scheduled distribution or distribution triggered under certain business conditions.
  • Provision of only the necessary reports, freeing users from tedious scanning of all materials in search of the relevant sections.
  • An intuitive system for cataloging and storing information, especially important given the many legislative measures concerning document storage and management.
  • Support for any number of users, so that the distribution software can scale successfully as the organization grows.

Here are examples of effective uses of information-distribution applications in today's big-data and AI world.

Electronic reporting

According to the Gartner Group, each document is copied 9 to 11 times on average. Mailing and delivering hard copies is a laborious procedure involving faxes, e-mails, or manual delivery. Electronic reports almost completely eliminate this problem: not only paper consumption but also other expenses are reduced, and, most importantly, the work takes significantly less time.

Software intended for disseminating information automatically sends data to those who are interested in it. This intelligent method of distribution further reduces the labor costs associated with manual document processing: the user no longer needs to search for information, because it is sent automatically to his mailbox.

Scheduled reports

Applications for information delivery can execute requests through e-mail. On a schedule (daily, weekly, and so on), certain information material is generated and sent to the subscriber, so no one has to spend hours producing the same reports. (For example, sales managers can designate a specific day on which a monthly report on their departments' work arrives in their mailboxes.) The distribution plan can be managed either by the administrator alone or by several authorized users. The content, time, frequency, format, and even method of dissemination can vary according to individual requirements.

Event-driven notifications

The administrator can schedule special notifications to be sent to users when certain conditions are met. Such dispatches reduce the time the administrator spends sending daily messages, as well as the time users spend running reports and searching for information.

The notifications work as follows:

  • The administrator sets the notification condition under which certain information is sent.
  • Planned tasks stored in the repository are checked at a user-defined frequency.
  • When a scheduled notification check comes due, the application tests all of its conditions (business rules).
  • If the conditions are not met, the information is not sent, and the check is postponed until the next scheduled time.
  • If the test result is positive, the corresponding information is generated and sent as a notification. The notification is activated according to its settings (automatically, permanently, once, or with a delay).
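The check-evaluate-send-or-postpone flow above can be sketched as follows. The rule format and messages are hypothetical; real BI products use their own configuration languages.

```python
def run_notification_check(task, data, send):
    """Evaluate the task's condition against current business data;
    send the formatted message on success, otherwise postpone."""
    if task["condition"](data):
        send(task["message"].format(**data))
        return "sent"
    return "postponed"

sent_messages = []

# Illustrative business rule: alert when spending exceeds the budget.
budget_alert = {
    "condition": lambda d: d["spent"] > d["budget"],
    "message": "Budget exceeded: {spent} spent against a budget of {budget}",
}

# Two scheduled checks: one within budget (postponed), one over (sent).
status_ok = run_notification_check(budget_alert, {"spent": 900, "budget": 1000}, sent_messages.append)
status_over = run_notification_check(budget_alert, {"spent": 1200, "budget": 1000}, sent_messages.append)
print(status_ok, status_over, sent_messages)
```

The same structure covers the stock-price and customer-complaint examples below: only the condition lambda and the message template change.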

Notifications serve to inform employees about important events and increase the efficiency of the company. For example, they can help:

  • the leader, if the budget amount is exceeded;
  • the investor, if a stock price falls below a certain level;
  • the trade representative, if a good deal is expected;
  • the key-account manager, if a large customer files a complaint.

Splitting reports

Reporting automates the manual process of "paper pushing," significantly reducing administrative costs and processing time. Imagine, for example, a report containing each employee's days worked and days missed. The pages of this report can be automatically split and sent electronically to each employee, saving the HR department many hours of work.
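The report-splitting ("bursting") step can be sketched like this, with delivery stubbed out and the record format invented for illustration: group the report's rows by owner so each person receives only their own page.

```python
def burst_report(rows, key="employee"):
    """Group report rows by the given key so each recipient
    gets only the pages that belong to them."""
    pages = {}
    for row in rows:
        pages.setdefault(row[key], []).append(row)
    return pages

# One combined attendance report for the whole company (illustrative data).
report = [
    {"employee": "alice", "days_worked": 20, "days_missed": 1},
    {"employee": "bob",   "days_worked": 18, "days_missed": 3},
    {"employee": "alice", "days_worked": 21, "days_missed": 0},
]

# In production each page would be e-mailed; here we just build the outbox.
outbox = burst_report(report)
print(sorted(outbox), len(outbox["alice"]))
```

Pairing this with the scheduled-delivery mechanism above is what turns a monthly HR chore into an automatic job.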

New Technologies for Information or Data Distribution 

In recent years the Web has become the main means of disseminating information, because using it:

  • improves access to business information and increases the number of users who can work with it;
  • provides a convenient interface for searching and navigation;
  • makes it possible to access different types of data located in different sources (both inside and outside the organization);
  • reduces hardware and software requirements and client-support costs thanks to thin-client technology;
  • allows platform-independent software (Java) and data representation (XML);
  • lets information be distributed to employees, partners, and customers through intranet/extranet and the Internet, expanding access to information and opening new ways of doing business.

As a result, the information needed to make decisions can be transferred to users and applications using two main technologies:

  • Web-based Business Intelligence tools (tools for creating reports, performing OLAP analysis, and data mining), as well as Web-oriented packages of analytical applications;
  • corporate information portals and distribution servers.

It is important to note that these technologies are not interchangeable: for maximum effect they should be used together.

Web-oriented BI-tools and analytical software packages distribute information and results of analytical operations through standard graphical and Web-interfaces. Many of these products also support scheduled and event-driven delivery of information to Web servers and mail systems, thereby leveraging all the capabilities of the underlying network tools.

As the volume of business information processed in the decision-making system grows, it becomes more likely that data will be distributed within a series of information stores located on different servers. This is true, for example, in cases where organizations obtain business information from external sources, internal offices, shared software and Web servers.

To help users apply this wide range of business data (both for ad hoc access and for offline delivery), the second technology is used: corporate information portals.

When corporate information is published on the portal, metadata about the location of specific information is used, along with the methods for extracting it. When an employee submits a request, the portal reads the relevant metadata and, using its access and delivery facilities, retrieves the necessary information.
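A minimal sketch of the metadata-driven lookup just described (source names and extractors here are hypothetical): the portal holds a registry mapping a topic to the system that owns the data and the method of pulling it out.

```python
# Registry: topic -> where the data lives and how to extract it.
# In a real portal the extractors would be database queries or API calls.
metadata = {
    "sales":   {"source": "erp_db",   "extract": lambda: [120, 140, 200]},
    "defects": {"source": "qa_files", "extract": lambda: [3, 1, 0]},
}

def portal_query(topic):
    """Read the metadata for a topic, then use the registered
    extractor to fetch the data from its source."""
    entry = metadata.get(topic)
    if entry is None:
        raise KeyError(f"no metadata registered for topic {topic!r}")
    return {"source": entry["source"], "data": entry["extract"]()}

result = portal_query("sales")
print(result["source"], sum(result["data"]))
```

The point of the indirection is that employees ask for topics, not servers: when the sales data moves to a new system, only the registry entry changes.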

Effective work is achieved only if information is properly shared; otherwise employees duplicate each other's activities and fail to reach mutual understanding. The portal thus becomes the foundation that guarantees free data sharing and solves technology problems in business.


Rapid development and high competition in the market do not forgive mistakes. In this environment it is necessary to make decisions correctly and in a timely manner. However, the task of finding the information needed for this, simple at first glance, raises a number of difficulties, since data are accumulated and stored throughout the organization. To solve it, companies need to develop infrastructures that ensure the timely dissemination and sharing of information.

Emerging trends, data-analyst trends, and future business-intelligence trends are queries we are frequently asked about. Software for disseminating information should ensure the transfer of data to a huge number of users inside and outside the enterprise and significantly improve work efficiency through:

  • rapid delivery of critical information;
  • minimizing work-cycle time by ending manual distribution, processing documents more rationally, and improving employee productivity;
  • distribution of any content, including files of different formats (for example, reports from other suppliers' applications or legacy systems, drawings, diagrams, documents, and presentations);
  • optimal file storage and simple, quick extraction of the necessary information.

Future Development of IoT Security | IoT Application and Product Trends

Internet of things

By 2032-2035, the IoT infrastructure will spread to 1 trillion devices and 100 million applications, according to the forecast of the analytical company SMA.

$11 trillion: at this fantastic amount, McKinsey analysts in June 2015 estimated the potential contribution of a new technology industry, the Internet of Things (IoT), to the global economy by 2025.

Future Development of IoT

By IoT, experts mean a set of networks consisting of objects that can communicate with each other over IP connections without human intervention. By McKinsey's estimates, the term "Internet of Things" applies to at least 150 object systems, which is why its economic effect can be so great even within the next decade.

future of iot 2018

IoT claims to become a technological shell for almost all markets: from transport, consumer electronics, and trade to agriculture, construction, and insurance. According to BI Intelligence's forecast, the number of IoT devices will grow from 6.6 billion in 2016 to 22.5 billion by 2021, and investments in the Internet of Things over this five-year period will reach $4.8 trillion. World Economic Forum experts estimate that by 2022 the total number of devices connected to the Internet will reach 1 trillion. This will entail both the formation of new markets and a radical change in a number of traditional businesses, the Davos experts warn.

22.5 billion: the number of IoT devices expected by 2021


An example of a market formed by the development of IoT is "smart home" technology. In 2016 the number of consumer "things" included in this "network of networks" grew to 4 billion, and by 2020 it will reach 13.5 billion, the analytical company Gartner has calculated. Potentially, IoT can capture the entire "material" environment of a person, Deloitte experts noted: a few years ago no one associated a TV with online services, yet today it is almost impossible to imagine a modern device without Smart TV functions and access to streaming services like Netflix.

The largest technology corporations are busy developing IoT infrastructure for the smart home: Apple introduced the HomeKit system in 2014, Google bought Nest, a maker of "smart" thermostats, for $3.2 billion, and Amazon has been developing the ecosystem of its home "manager" Alexa for a third year. According to Markets & Markets, the smart-home technology market will grow from $47 billion in 2015 to $122 billion by 2022.

In more traditional industries, IoT will form entire new segments. In medicine, for example, its market could reach $117 billion by 2020, the analytical company MarketResearch predicted. "Electronic" intensive-care wards, for instance, are based on the new technology: with the help of high-resolution cameras, a doctor can remotely monitor patients and give real-time advice to colleagues on-site if emergency measures are required. Telemedicine can accelerate the penetration of quality medical services in developing countries such as India (there, by 2019, the emerging market's leading player, Philips, plans to equip up to 30,000 hospital beds with cameras). Simpler versions are portable devices that monitor patients in real time, transfer data to a doctor, and analyze blood-glucose levels, take cardiograms, and so on.


The future of IoT does not look cloudless, analysts agree: removing the human from the process of communicating with devices carries not only advantages but also risks. An IoT botnet of devices infected with the Mirai malware, for example, was the cause of a series of powerful attacks in 2016. Mirai constantly scans the Internet for open IoT devices that can be accessed through easily cracked or default accounts. By compromising the devices, hackers connect them to the botnet for subsequent DDoS attacks.
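From the defender's side, the weakness Mirai exploits can be audited with a simple check of a device inventory against well-known default credential pairs. The device list and credential pairs below are illustrative; real audits use much longer default-password dictionaries.

```python
# A few well-known factory-default username/password pairs (illustrative subset).
DEFAULT_CREDENTIALS = {("admin", "admin"), ("root", "root"), ("admin", "1234")}

def audit_devices(devices):
    """Return the IDs of devices still using a well-known
    default username/password pair."""
    return [d["id"] for d in devices
            if (d["user"], d["password"]) in DEFAULT_CREDENTIALS]

# Hypothetical inventory of a company's connected devices.
inventory = [
    {"id": "cam-01", "user": "admin", "password": "admin"},
    {"id": "dvr-07", "user": "ops",   "password": "S3cure!pass"},
    {"id": "cam-02", "user": "root",  "password": "root"},
]

# The flagged devices are exactly what a Mirai-style scan would find first.
print(audit_devices(inventory))
```

An inventory that comes back empty from this audit removes the easiest entry point the botnet relies on.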

For the past few years, researchers have constantly raised the issue of protecting connected devices. Last year, Charlie Miller and Chris Valasek demonstrated how to gain wireless access to the critical systems of a Jeep Cherokee, successfully taking control and forcing it off the road. Vasilios Hioureas of Kaspersky Lab and Thomas Kinsey of Exigent Systems studied potential weaknesses in the protection of video-surveillance systems. Later, one manufacturer of medical equipment reported a vulnerability in its insulin pump that allows attackers to turn the device off or change the dosage of medicine. Fears have also been raised about items we use every day, such as children's toys, baby monitors, and doorbells.

To protect the data that IoT gadgets handle, manufacturers will spend more and more: Gartner analysts expect spending on IoT infrastructure security to grow to $30 billion by 2020. Businesses should treat with particular care the spheres in which the Internet of Things interacts with sensitive information, such as data on human health. If attackers gain access to such data, even a doctor cannot protect the patient: the hacker only needs to make adjustments to the software shell of the "things" responsible for making decisions, analysts say. As of 2014, up to 70% of IoT solutions were vulnerable, according to a Hewlett Packard study.


By 2020, spending on IoT security will reach $30 billion.

As Digital Technology Review sees it, the key problem in IoT is the industry's youth: the lack of industry standards both for system compatibility and for information security, even at the development level. Add to this the illusion that a device with little processing power is of no interest to hackers (an illusion Mirai dispelled very revealingly).

However, according to MIT Technology Review in 2018, the prospects for IoT business are not so gloomy: the technology simply has not yet passed Gartner's "hype cycle" of maturity and has plenty of time to iron out the bumps, analysts concluded.

In the long term, IoT will be transformed into the "Internet of Everything," the IDC company noted in a 2016 study. Behind this term lies a system of "close interaction of networks, people, processes, data, and objects for the implementation of intelligent interaction and justified actions," the experts explained. This transformation will take decades.

Please let us know your forecasts for the future development of IoT: IoT security, future IoT applications, IoT trends, and future IoT products.

Thursday, May 10, 2018

The Development of Cloud Technologies | Latest TechReview Updates 2018

In 2030, 52% of the world's data will be stored in the public cloud, according to the forecast of VMware.

More than 50% of the world's IT budget spending last year went, for the first time in history, to cloud resources, according to Gartner.

Google Senior Vice President Diane Greene proclaimed 2016 the beginning of the "Cloud 2.0" era. The largest IT deal of the year was Dell's $67 billion acquisition of EMC, whose most profitable asset is VMware, the world's largest developer of virtualization software. By 2018, Cisco forecasts, cloud data centers will handle 78% of all workloads, and the SaaS (software-as-a-service) segment will account for 59% of the total cloud load. Deloitte and TechRepublic analysts regularly name cloud technologies among the areas of the IT market in which the largest companies will invest most actively in the coming years.

Cloud Big Data Technologies Reviews


Gartner regularly audits emerging technologies to determine which popular trends may never reach the "maturity" phase and which have already overcome their growing pains and are entering a development phase that involves forming a new market. Gartner assigned cloud technology to the second group as early as 2015: although cloud computing has long ceased to be exotic, its era is only beginning.

"We constantly have to revise the estimates upwards. The market is growing faster than we expected back in 2014," said Dave Bartoletti, an analyst at Forrester Research. According to the company's forecast, the world market for cloud technologies will grow by 22% in 2017, to $146 billion. For comparison: in 2015, Forrester's estimate was $87 billion, and its forecast for 2020 is $236 billion. The IaaS (infrastructure leasing) segment will grow by 35% in 2017, to $32 billion. Revenue of the segment's largest player, the Amazon Web Services division, will reach $13 billion. Microsoft Azure's cloud revenue will be two to three times smaller, and Google Cloud Platform's will reach $1 billion, Forrester believes.


The Russian market is still at an early stage of development but is growing rapidly: the average annual growth rate to 2020 will be 34.3% for IaaS and 20% for SaaS, research company iKS-Consulting notes. According to its analysts, in 2015 the country's cloud technology market grew by 39.6% year over year, to 27.6 billion rubles, of which 22.2 billion rubles came from SaaS and 4.4 billion rubles from IaaS. Bloomberg calls companies' transition to the hybrid cloud the key trend, the "new normal" of the market for 2017 and the next few years: as confidence in clouds grows, businesses will no longer lock themselves into one provider and will begin to diversify their sources of data-processing and storage resources. Companies will obtain additional resources from the cloud in whatever way is most convenient and fastest for them; in which cloud those resources are "allocated" at any given moment will not matter. Gartner expects that by the end of 2017 up to 50% of cloud customers will prefer hybrids.
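To make the growth rates above concrete, here is a minimal sketch of the compound-growth arithmetic they imply. The 2015 baselines and CAGRs come from the iKS-Consulting figures quoted above; the projected 2020 totals are simple compounding, not the firm's own forecast.

```python
# Project the 2015 Russian market figures forward with the quoted
# average annual growth rates (20% for SaaS, 34.3% for IaaS).

def project(value, cagr, years):
    """Compound a starting value by `cagr` (as a fraction) over `years` years."""
    return value * (1 + cagr) ** years

saas_2015 = 22.2  # billion rubles
iaas_2015 = 4.4   # billion rubles

saas_2020 = project(saas_2015, 0.20, 5)
iaas_2020 = project(iaas_2015, 0.343, 5)

print(f"SaaS 2020: {saas_2020:.1f}B RUB")  # SaaS 2020: 55.2B RUB
print(f"IaaS 2020: {iaas_2020:.1f}B RUB")  # IaaS 2020: 19.2B RUB
```

Note how the higher IaaS growth rate more than quadruples that segment in five years, while SaaS roughly two-and-a-half-folds from its larger base.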

Why Is Cloud Technology Needed?

Cloud technology will be needed for a number of research areas that require huge computing power, for example, developments in machine learning and artificial intelligence. Last year, Google hired Professor Fei-Fei Li, who headed the Artificial Intelligence Laboratory at Stanford, into the division dedicated to developing its cloud platform. For Li, one avenue for "democratizing" AI research was precisely the creation of a cloud platform for machine learning.

Another direction is providing infrastructure for new robotics. The term "cloud robotics" was coined inside Google as early as the beginning of the 2010s: it assumes that an inherent capability of robots, which are increasingly used commercially, should be constant access to the processing power of data centers for decision-making. For example, an unmanned vehicle delivering goods along a given route continuously reads data from the developer company's mapping, geolocation and other services while on the road. The main problem with the transition to cloud infrastructure and, more broadly, the decentralization of IT functions in companies is security: 57% of companies say decentralization results in the use of unsafe solutions, and 58% say data-protection control systems are insufficient, according to a 2016 survey of 3,300 specialists at large and medium-sized companies in 20 countries commissioned by VMware.

The Development of Cloud Technologies

The construction of new data centers will inevitably require increased attention to cybersecurity, said Kevin Jackson, a freelance expert who has worked with IBM and Dell, in an interview with CloudEndure. He forecasts that companies will toughen measures against cyber threats, increase the volume of security testing, and improve secure-access technologies, including data encryption. According to a Clutch poll, 64% of market players consider the cloud a more secure infrastructure. Still, the majority of respondents continue to build up their defensive arsenal with encryption (61% of respondents), additional identification technologies (52%) and additional testing of cloud systems (48%).

VMware last year agreed to cooperate with Kaspersky Lab on data protection in software-defined data centers. According to Gartner, 80% of market players are ready to increase their cloud cybersecurity budgets in 2017. At the same time, analysts advise companies to pay special attention to training the personnel responsible for working with the cloud: more than half of data leaks occur not because of provider vulnerabilities but because of customers and their internal problems, Gartner states.


In Russia, 13% of companies have experienced incidents related to cloud-infrastructure security at least once a year, according to a Kaspersky Lab poll, and about a third of companies (32%) lost data as a result of these incidents. Yet business does not take the threat seriously: only 27% of Russian companies believe that the security of their corporate network depends on the security of their virtual systems and cloud infrastructures. Meanwhile, more than 80% of cyber incidents in the clouds occur because of the actions, or rather the inaction, of the virtual machine's owner: misconfigured information-security solutions inside cloud computing machines, overly simple passwords, and the work of insiders.

But Russian companies are still more concerned about protecting external cloud services. They worry that incidents may occur with suppliers to whom business processes are outsourced, with third-party cloud services, or in the IT infrastructure where the company leases processing power. Yet only 15% of companies monitor third parties' compliance with security requirements.

Dataclysm: Who We Are (When We Think No One's Looking)

Big Data: A Revolution That Will Transform How We Live, Work, and Think

In the next article we are going to cover the following topics: technology and business communication, the impact of technology on business, examples of learning technologies, how technology enhances learning, and technology in the workforce.

The https://www.digitaltechnologyreview.com/ site is a participant in the Amazon Services LLC Associates Program, and we earn a commission on purchases made through our links.

Tuesday, May 8, 2018

Quantum Computing 2020: The Development of Quantum Computers Explained

2020: the development of a quantum computer (according to the forecast of Jeremy O'Brien, director of the Centre for Quantum Photonics at the University of Bristol)

Digital Technology Review | Quantum Computers Explained

"It promises to solve some of humanity's most difficult problems. It is backed by [Amazon founder] Jeff Bezos, NASA and the CIA. Each copy costs $10 million and operates at a temperature of -273°C. No one knows exactly how it works." With this teaser, in February 2014, Time magazine described the unexpected hero of its cover, the D-Wave Two quantum computer. The flattering words were still an advance: the machine built by the American company D-Wave, whose partners really do include leaders of the technology industry, was an important but not decisive step toward the invention of a truly revolutionary device.


Quantum Computer to Handle Big Data

A quantum computer is a machine that combines the achievements of computer science and quantum physics, the most complex branch of modern science, which studies elementary particles smaller than an atom. The physics of these particles often clashes with accumulated academic knowledge (for example, it contradicts Albert Einstein's theory of relativity). A quantum particle can be in different places and in different states simultaneously. This logically mutually exclusive principle is called superposition.

It is the superposition principle that should form the basis of a full-fledged quantum computer. Unlike a conventional computer, which processes information as binary code (all data described as 0 or 1), such a device works not with bits but with qubits (quantum bits), which can be in positions 0 and 1 simultaneously. Thanks to this, a quantum computer processes data many times faster than a conventional analog and opens the way for humanity to solve problems that are simply inaccessible today.
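The idea of a qubit holding 0 and 1 at once can be sketched on a classical machine. Below is a minimal plain-Python model (no quantum hardware or library assumed): a qubit is a pair of amplitudes, the Hadamard gate creates an equal superposition, and squaring the amplitudes gives the measurement probabilities.

```python
# Model a single qubit as two amplitudes; squared magnitudes give the
# probabilities of measuring 0 or 1 (the Born rule).
import math

qubit = [1.0, 0.0]  # the definite state |0>: amplitude 1 for "0", 0 for "1"

def hadamard(state):
    """Apply the Hadamard gate, which puts a basis state into equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return [s * (a + b), s * (a - b)]

def probabilities(state):
    """Probability of each measurement outcome is the squared amplitude."""
    return [abs(amp) ** 2 for amp in state]

superposed = hadamard(qubit)
print(probabilities(superposed))  # two values of ~0.5: 0 and 1 are equally likely
```

Simulating n qubits this way needs 2**n amplitudes, which is exactly why classical machines cannot keep up and real quantum hardware becomes interesting.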

The ability to handle large amounts of data at enormous speed is exactly what can help states and businesses solve many problems, which makes quantum computers the machines needed in the coming era of big data. Take any task, for example, how a person gets from point A to point B: an ordinary computer will solve it by analyzing all available options in turn, while a quantum computer will analyze all the routes simultaneously and offer the optimal solution many times faster.

The Application of a Quantum Computer

The use of a quantum computer should help make breakthroughs in a wide range of areas: for example, in predicting weather conditions (accurate climate models would, among other things, improve the work of autopilots in aviation) and in developing new drugs (by sorting through trillions of combinations of molecules to find effective treatments against cancer, for example). The same applies to true artificial intelligence: developers place high hopes on quantum computers, which could be continuously improved through machine learning algorithms.

Today, the creation of quantum computers involves teams that can attract large-scale investment from states and technology corporations. For example, documents published by Edward Snowden showed that the US National Security Agency invested about $80 million in the development of a quantum computer in order to create new encryption mechanisms.

D-Wave Systems is backed by Google, NASA and other companies and agencies; besides its projects, those of IBM and Microsoft also stand out.


D-Wave, which claims to have invented the "first commercial" quantum processor, introduced a new device with a record 2,000-qubit chip in September 2016. A few months earlier, the Lockheed Martin concern had increased the capacity of its own Center for Quantum Computing (QCW), also based on D-Wave's development, to 1.1 thousand qubits. Not all of a device's qubits are used in computation, by analogy with the RAM that is never fully used on conventional computers or smartphones: Lockheed Martin itself acknowledged that of 1,152 qubits, 1,098 work on quantum problems. D-Wave's processors are not universal quantum machines; they perform quantum calculations on the information researchers load into them. That is, for now D-Wave can solve only a narrow list of typical tasks.

In May 2016, IBM introduced IBM Quantum Experience, a cloud service for quantum computing. Compared with D-Wave's figures the system's capacity is modest, only 5 qubits, but IBM calls its development a breakthrough: researchers around the world can already connect to the service. For example, the company demonstrated its computer's ability to run Grover's algorithm: where a conventional device needs up to three attempts to find a chosen card in a deck of four, a single query is enough for the quantum one.

Microsoft has invested in quantum computing research since the mid-2000s and in November 2016 announced the creation of a unit to develop an innovative computer. According to The New York Times, Microsoft is ready to spend tens of millions of dollars to develop the technology and shake D-Wave's position in the emerging market.

According to market research company Market Research Media, by 2020 the market's volume will grow to $5 billion, and as the technology matures a sharp leap could follow that would put the classical computer industry on the brink of extinction.

At the same time, many experts remain skeptical about the prospects of quantum computers. No company has yet announced a universal machine that would displace today's personal computers through unattainable power, as the developers themselves admit. No one knows when humanity will accumulate enough knowledge to create a full-fledged revolutionary device: Google itself has noted that its joint developments with D-Wave are at the "earliest stages." It is also important to avoid errors in quantum computation caused by environmental effects; correcting them is much more complicated than in classical computing. Eliminating such errors may in the future consume up to 99% of a quantum processor's power, but even the remaining share would be enough for a revolution in technology.


Finally, the technology also carries risks associated with cybersecurity.

As Michele Mosca, co-founder of the Institute for Quantum Computing at the University of Waterloo, predicted in a column for the Global Risk Institute, the probability that the main encryption tools used today will be cracked will grow to a 1-in-7 chance by 2026 and to 50% by 2031.

Modern systems usually encrypt data with a secret key and a symmetric algorithm; website certificates, digital signatures of applications, and encrypted exchanges in Internet banks, instant messengers and the like are arranged on this principle. The key is the same for the sender and receiver (hence "symmetric") and is established at the start of a session with the help of a second, asymmetric cryptosystem. The asymmetric algorithm is computationally expensive, so it is used only to transmit the secret key. Even if a spy or hacker intercepts a message protected by the asymmetric algorithm, decrypting it with modern computing capacity would take tens to millions of years, depending on the key length. A quantum computer would need about as much time to decrypt it as a conventional one needs to encrypt it.
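The hybrid pattern described above can be sketched in a few lines. This is a deliberately insecure toy, not a real cryptosystem: the asymmetric step is textbook Diffie-Hellman over a tiny prime (real deployments use 2048-bit-plus parameters or elliptic curves), and the symmetric step is a simple SHA-256 keystream XOR standing in for a vetted cipher such as AES-GCM. All parameter values are made up for illustration.

```python
# Toy hybrid encryption: an asymmetric key agreement establishes a shared
# secret, which then keys a symmetric stream cipher for the bulk data.
import hashlib

# Public Diffie-Hellman parameters (toy-sized, insecure; for illustration only).
p, g = 0xFFFFFFFB, 5

alice_secret, bob_secret = 123456, 654321   # private values, never transmitted
alice_pub = pow(g, alice_secret, p)         # exchanged in the clear
bob_pub = pow(g, bob_secret, p)

# Both sides derive the same shared key without ever sending it.
shared_a = pow(bob_pub, alice_secret, p)
shared_b = pow(alice_pub, bob_secret, p)
assert shared_a == shared_b

def keystream_xor(key: int, data: bytes) -> bytes:
    """Symmetric step: XOR data with a SHA-256-derived keystream (toy cipher)."""
    out, counter = bytearray(), 0
    while len(out) < len(data):
        out.extend(hashlib.sha256(f"{key}:{counter}".encode()).digest())
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

msg = b"wire transfer: 100 RUB"
ciphertext = keystream_xor(shared_a, msg)   # sender encrypts
plaintext = keystream_xor(shared_b, ciphertext)  # receiver decrypts with the same key
print(plaintext == msg)  # True
```

The quantum threat in the article targets exactly the asymmetric step: Shor's algorithm would recover the private values from the public ones, after which the symmetric traffic falls too.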

This risk may in the future encourage states to regulate the industry and limit the development of technologies beyond the authorities' control, said Marcos Lopez de Prado, managing director of the analytical company Guggenheim Partners, in an interview with the BBC.

The following questions were asked by our readers via email; we shall try to answer them in the next article.
