
Artificial Intelligence

Artificial Intelligence (AI), sometimes referred to as machine intelligence, is the technology of creating intelligent (thinking) computer systems.


The basic capabilities of AI are:

  • language understanding;
  • learning;
  • ability to make decisions and act.

AI is associated with using computers to understand human intelligence, but is not limited to biologically plausible methods.


This science is connected with psychology, neurophysiology, transhumanism and other disciplines. Like the rest of computer science, it relies on a mathematical apparatus. Philosophy and robotics are of particular importance to it.

Artificial Intelligence is one of the most popular scientific areas; its foundations were laid in 1956. It is now growing rapidly, with evangelists in forward-looking enterprises in nearly every country.


AI is a set of related technologies and processes that are developing rapidly and substantially, for example:


  • natural language processing
  • machine learning
  • expert systems
  • virtual agents (chatbots and virtual assistants; a minimal sketch of one follows this list)
  • recommendation systems.
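
To make one item above concrete, here is a minimal sketch of a rule-based virtual agent of the kind mentioned in the list. The intents and replies are invented purely for illustration; production chatbots rely on natural language processing and machine learning rather than simple keyword matching.

```typescript
// A toy rule-based virtual agent: it matches keywords in the user's message
// to a canned reply. All intents and replies are illustrative placeholders.
interface Intent {
  keywords: string[];
  reply: string;
}

const intents: Intent[] = [
  { keywords: ["price", "cost"], reply: "Our pricing starts at $10 per month." },
  { keywords: ["hours", "open"], reply: "Support is available 24/7." },
];

function respond(message: string): string {
  const text = message.toLowerCase();
  const match = intents.find((i) => i.keywords.some((k) => text.includes(k)));
  return match ? match.reply : "Sorry, I did not understand; a human will follow up.";
}

console.log(respond("What does the plan cost?")); // prints the pricing reply
```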

AI Applications

The fields of AI application are wide in scope and encompass both familiar technologies and emerging directions that are still far from mass-market use: a whole range of solutions, spanning from robotic vacuum cleaners to complex space stations.


Artificial Intelligence is not a monolithic subject area. Some technological directions of AI are emerging as new sub-sectors of the economy and as separate entities in their own right, while at the same time serving most other sectors of the economy.

The Goals of Artificial Intelligence

The traditional goals of AI include knowledge representation, reasoning, planning, natural language processing, learning, perception and the ability to manipulate objects.

Types of AI

  • Reactive Machines AI
  • Limited Memory AI
  • Theory of Mind AI
  • Self-aware AI
  • Artificial Narrow Intelligence (ANI)
  • Artificial General Intelligence (AGI)
  • Artificial Superhuman Intelligence (ASI)


AI Perspectives

There are two directions of AI development:

  • to bring specialised AI systems closer to human capabilities and to integrate them, in the way this integration is realised in human nature;
  • to create an Artificial Mind: the integration of already created AI systems into a single system capable of solving the problems of mankind.

AI development drives the adoption of these technologies in classical sectors of the economy, leading to the algorithmisation of almost all routine operations. The business value of AI lies in the optimisation and automation of routine processes, which machines can take over in every sphere, from customer support (chatbots and AI advisers) to food processing and construction (3D printers), decreasing costs and speeding up the achievement of results.

Additional Terms
Alpha Software
is computer software in the early testing phase. It has enough basic functionality to be usable, but it is often buggy and lacks features that will be integrated into the final version. Alpha software is typically used for internal testing.

Internal (alpha) testing is the stage of testing the program as a whole, carried out by testers who are usually not the developers of the software product but who, as a rule, work within the organisation or community developing it. It can also be the stage at which new functionality is added, at which point the program can only be used to get acquainted with future capabilities. Most often, alpha testing is carried out at an early stage of software development, but in some cases it can be applied to a finished product as an internal acceptance test. Sometimes alpha testing is performed under a debugger or in an integrated development environment that helps to quickly identify the errors found. Detected bugs can be passed to testers for additional investigation in an environment similar to the one in which the program will be used. Typically, alpha testing ends with a feature freeze and transitions into beta testing.

Beta testing is the stage of active public testing and debugging of a program that has passed alpha testing (if any). Programs at this level can be used by other software developers for compatibility testing; nevertheless, they can still contain a large number of errors. Sometimes beta testing is performed in order to get feedback about the product from its future users.

For free and open-source software, the alpha phase is often characterised by filling out the functional content of the code, while beta testing is the error-correction stage. As a rule, at each stage of development intermediate results of the work are made available to end users.
Bespoke Software
Bespoke software consists of programs tailor-made to an individual customer's requirements, and thus wholly unique, as opposed to standard software that is developed and sold to customers as it is (without special features designed to meet particular needs). Bespoke software development is a service: the provider supplies the necessary technical expertise and manpower. Functionality, delivery schedule and terms of payment are subject to a contract between the service provider and the customer. The customer is heavily involved in the development process and judges the success of the work. Almost all bespoke systems are application software, which requires an operating system to be preloaded onto the user's PC. Bespoke software has been built since the 1960s and was initially the only means of obtaining application software.

What Is It in Essence?
Standard software often responds to a limited, or insufficient, number of requirements, so bespoke software is usually ordered when there is no equivalent standard software available, i.e. in highly specialised areas. It can also be created in order to bring together disparate products, a common practice with software suites such as ERP and CRM.

Bespoke Software Development in Brief
Development is performed gradually in several phases, or milestones: at the end of each phase the client receives a version of the product. Each phase ends with acceptance testing, in which the client verifies that the software does what is expected of it. The software is then tested in many conditions, with real data, possibly accompanied by stress tests designed to make the software fail in order to see how it recovers and returns to normal. Planning in several phases makes it possible to take the evolution of the customer's expectations into account: the end of one phase can inspire the customer to ask for a more refined product.

Payment can be made on an hourly basis (regular payment of the developer's working hours) or at a fixed price (negotiated at the conclusion of the contract and typically paid in several instalments). Fees range between $50 and $300 per hour, depending on the IT vendor. The scope of work depends on the amount of source code and the specifications, and may in some cases exceed a year. With fixed payments, the evolution of the customer's requests may lead to a renegotiation of both the contract and the final cost.

Bespoke software is built with special development tools, and its marketing is very different from that of standard software: the software is treated as a project. It is often created from scratch and is therefore not immediately available. The customer is strongly involved in the development work, and geographical proximity between the customer and the supplier matters. The risk of commercial failure is borne by the customer. The acquisition cost is high because it is fully paid by a single client. Ownership of the software and the licence conditions are among the subjects of the contract signed between the supplier and the customer.
Business Intelligence
(BI) is a technology for extracting information and for gathering and presenting knowledge, using a variety of hardware and software technologies, to help executives and business owners make decisions. This approach enables organisations to transform data into structured information.

BI Characteristics
At the core of BI technology is organised analysis for the end user. BI supports the iterative process of a typical business user's work: through mathematical modelling it produces conclusions that can change the company in a positive way. BI has a wide range of users in the enterprise, including executives and analysts.

Business Intelligence Product Classification
Today the categories of BI products include business intelligence tools and business intelligence applications. BI tools, in turn, are divided into:

  • query and report generators;
  • advanced BI tools: online analytical processing (OLAP) tools;
  • enterprise BI suites (EBIS);
  • BI platforms.

The bulk of BI tools falls into enterprise suites and platforms. Tools for generating queries and reports have largely been absorbed and replaced by enterprise BI suites. Multidimensional OLAP engines or servers, as well as relational OLAP engines, serve as tools and infrastructure for BI platforms. Most of these systems are used by end users for accessing, analysing and reporting on data, while application developers use the platforms to create and deploy applications that are not themselves considered BI tools.
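
As a toy illustration of what "transforming data into structured information" means in practice, here is a minimal sketch; the data, field names and the revenue-per-region report are invented for this example and do not represent any particular BI product.

```typescript
// Turn raw operational records into a structured summary (revenue per region),
// the kind of output a query-and-report tool or OLAP query would produce.
// All data and field names are illustrative placeholders.
interface Sale {
  region: string;
  amount: number;
}

const sales: Sale[] = [
  { region: "North", amount: 1200 },
  { region: "South", amount: 800 },
  { region: "North", amount: 450 },
];

const revenueByRegion = sales.reduce<Record<string, number>>((acc, sale) => {
  acc[sale.region] = (acc[sale.region] ?? 0) + sale.amount;
  return acc;
}, {});

console.log(revenueByRegion); // { North: 1650, South: 800 }
```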
Additional Terms of Software Development
Net Promoter Score (NPS)
Net Promoter Score (NPS) is an index that measures customer loyalty to a product or company and is used to assess readiness for repeat purchases.

How It Works
Measuring the NPS loyalty index involves several steps:

  • Consumers are asked to answer the question "What is the probability that you would recommend a company/product/brand to your friends/acquaintances/colleagues?" on a 10-point scale, where 0 corresponds to "I will not recommend it in any way" and 10 to "I will surely recommend it".
  • Based on the scores obtained, all consumers are divided into three groups: 9-10 points are promoters of the product/brand, 7-8 points are passives, and 0-6 points are detractors.
  • The NPS index itself is then calculated as NPS = % promoters - % detractors.

As a result, the loyalty score is expressed on a scale from -100 to 100. If all customers are willing to recommend the product, the score approaches 100; if none are, the NPS drops towards -100. The NPS trademark was registered for the marketing tool that automates calculation of the above data.

History
Frederick Reichheld is considered the founder of the method; he first announced it in the article "The One Number You Need to Grow", published in the Harvard Business Review in December 2003. In 2006 he released a book entitled "The Ultimate Question: Driving Good Profits and True Growth", in which he continued his arguments on the loyalty, profitability and growth of a company. In 2001 Reichheld had conducted research in more than 400 American companies, where the main task was to measure the influence of customer loyalty (measured by NPS) on a company's growth rate. The main result was the conclusion that the average NPS across industries was 16%, but for companies such as eBay and Amazon it was 75%. Reichheld does not claim that the connection is present everywhere: it is absent altogether in monopolistic markets. However, industries such as passenger air travel, insurance and car rental became prime examples of the interconnection, which is natural, since these companies are service providers whose customers' satisfaction and loyalty depend on the level of customer service. As a result, many companies have adopted this technology, including Apple, American Express, eBay, Amazon, Allianz, P&G, Intuit and Philips.

For certain industries, especially software, it has been shown that detractors often stay with the company while passives leave, which appears to reflect relatively high switching barriers. Faced with criticism of the Net Promoter Score, proponents of the approach replied that the statistical analysis offered only proved that the "recommendation" question has predictive capacity similar to other indicators, but failed to address the real issue, which is the core of the argument presented by Reichheld. Proponents also argue that analysis of third-party data is not as useful as a company's analysis of its own customer base, and that the practical benefits of the method (a simple concept to communicate, a short survey, the ability to follow up with customers) outweigh any statistical disadvantages of the approach. They also allow surveys based on other questions to be used within the Net Promoter system, as long as the question can reliably classify customers as promoters, passives and detractors.
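
To make the arithmetic above concrete, here is a minimal sketch of the calculation; the function and variable names are illustrative and not part of any official NPS tooling.

```typescript
// Classify 0-10 survey scores and compute NPS = % promoters - % detractors.
type Category = "promoter" | "passive" | "detractor";

function classify(score: number): Category {
  if (score >= 9) return "promoter";   // 9-10 points
  if (score >= 7) return "passive";    // 7-8 points
  return "detractor";                  // 0-6 points
}

function netPromoterScore(scores: number[]): number {
  const total = scores.length;
  if (total === 0) return 0;
  const promoters = scores.filter((s) => classify(s) === "promoter").length;
  const detractors = scores.filter((s) => classify(s) === "detractor").length;
  return ((promoters - detractors) / total) * 100; // ranges from -100 to 100
}

// Example: 5 promoters, 2 passives, 3 detractors -> (5 - 3) / 10 * 100 = 20
console.log(netPromoterScore([10, 9, 9, 10, 9, 8, 7, 3, 5, 6]));
```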
5G
is the fifth-generation mobile communication technology based on the IMT-2020 standard. The speed of Internet access in a 5G network is predicted to be around 10 Gbit/s. 5G reduces signal delay to one millisecond, against 10 milliseconds on 4G networks and 100 milliseconds on 3G. New generations of mobile communication appear every 10 years; this interval is spent on developing the technology and standards and on upgrading infrastructure. It is expected that 5G network capacity will be enough to serve more than 1 million devices per square kilometre at an average speed of 100 Mbit/s.

Who Works with 5G Networks Today
5G technologies are used by:

  • research laboratories (for example, the 5G Lab Germany at the Dresden University of Technology);
  • mobile operators (British Vodafone, American Verizon and AT&T, Japanese NTT DoCoMo, Swedish Telia, etc.);
  • telecom equipment suppliers (Swedish Ericsson, Chinese Huawei, Finnish Nokia, South Korean Samsung, etc.).

5G Applications
Many emerging apps and services require significantly better mobile Internet connection characteristics than existing commercial LTE networks can deliver. It is expected that 5G networks will support billions of connections, making it possible to create new services in areas such as the tactile Internet (transmission of touch), IT and telecom, the automotive industry (self-driving cars), the entertainment industry, education, agriculture and many others. 5G networks will also make it possible to improve the quality of existing services that involve large volumes of traffic.

Launch of the World's First 5G Network
On October 1, 2018, Verizon announced the launch of the world's first commercial fifth-generation (5G) network. The operator deployed it in four US cities: Sacramento, Houston, Los Angeles and Indianapolis. The company officially declared Houston resident Clayton Harris "the first customer of the 5G network in the world"; the network provides an average speed of 300 Mbit/s and a maximum of 940 Mbit/s.
Node.js
is a server-side platform for running JavaScript on the V8 engine. JavaScript traditionally performs actions on the client side, while Node lets commands written in JS be executed on the server. With Node, front-end programmers can write full-fledged software applications. Node can call commands from JavaScript code, work with external libraries and act as a web server.

Node Advantages
Node is easier to scale. When thousands of users connect to the server at the same time, Node works asynchronously: it sets priorities and allocates resources more intelligently. Java, for example, allocates a separate thread for each connection.

Features

  • Asynchronous, event-driven scripts. All Node.js APIs are asynchronous and non-blocking. In essence, this means that a Node-based server never waits for data to be returned from an API call: after the call, the server proceeds to the next operation, and the Node.js event notification mechanism delivers the response from the previous call.
  • Very fast. Built on the Google Chrome V8 JavaScript engine, the Node.js library executes code very quickly.
  • Single-threaded but easily scalable. Node.js uses a single-threaded model with an event loop. The event mechanism helps the server respond in a non-blocking way and provides high scalability, unlike traditional servers that create a limited number of threads to process requests. A single-threaded Node program can serve many more requests than traditional servers such as the Apache HTTP Server.
  • No buffering. Node.js apps do not buffer data; they simply output data in chunks.

Where Is Node.js Used?
Node.js has established itself as an ideal technological solution in the following areas:

  • input/output-bound applications;
  • streaming apps;
  • data-intensive real-time (DIRT) applications;
  • JSON API based applications.

Node is successfully used by such large companies as eBay, Microsoft, PayPal, General Electric, Uber, GoDaddy, Wikipins and Yahoo!.
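
As an illustration of the non-blocking, event-driven style described above, here is a minimal sketch of an HTTP server that reads a file asynchronously; the file name and port are placeholders chosen for this example.

```typescript
// Minimal sketch of a non-blocking Node.js HTTP server.
// While the file is being read asynchronously, the event loop keeps serving other requests.
import * as http from "node:http";
import * as fs from "node:fs";

const server = http.createServer((_req, res) => {
  // fs.readFile does not block: the callback runs once the data is ready,
  // and in the meantime Node can accept and handle other connections.
  fs.readFile("greeting.txt", "utf8", (err, data) => {
    if (err) {
      res.writeHead(500, { "Content-Type": "text/plain" });
      res.end("Could not read file");
      return;
    }
    res.writeHead(200, { "Content-Type": "text/plain" });
    res.end(data);
  });
});

server.listen(3000, () => {
  console.log("Listening on http://localhost:3000");
});
```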