Machine translation, sometimes referred to by the abbreviation MT (not to be confused with computer-aided translation, machine-aided human translation, or interactive translation), is a sub-field of computational linguistics that investigates the use of software to translate text or speech from one natural language to another.
In the 1980s and 1990s, MT software was rule-based, but since the 2000s, statistical methods and, more recently, the re-emergence of neural networks and more advanced machine learning techniques have proved far more successful.
Artificial General Intelligence (AGI) refers to a type of artificial intelligence that is at least as capable as human intelligence. As powerful as machine learning has become with neural networking and deep learning techniques, it does not approach AGI, and when, or even if, it will remains controversial.
A NoSQL (originally referring to “non-SQL”, “non-relational”, or “not only SQL”) database provides a mechanism for storage and retrieval of data that is modeled by means other than the tabular relations used in relational databases. Such databases have existed since the late 1960s, and include document databases and XML databases, which, along with object-oriented databases (OODB), focus on managing unstructured or semi-structured data. “NoSQL” became popular in the early twenty-first century, triggered by the needs of Web 2.0 companies such as Facebook, Google, and Amazon.com. NoSQL databases are increasingly used in big data and real-time web applications. NoSQL systems are also sometimes called “not only SQL” to emphasize that they may support SQL-like query languages.
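The document model described above can be illustrated with a minimal sketch. This is not the API of any particular NoSQL product; the store, `put`, and `get` names are hypothetical, and the point is only that each record is a self-describing document, so documents with different fields coexist in one store, unlike rows in a relational table.

```python
import json

# Hypothetical in-memory document store: document id -> JSON document.
store = {}

def put(doc_id, doc):
    """Store a document as self-describing JSON under an id."""
    store[doc_id] = json.dumps(doc)

def get(doc_id):
    """Retrieve and decode a document by id."""
    return json.loads(store[doc_id])

# Two documents with different shapes coexist in the same store --
# no shared schema is required, unlike a relational table.
put("u1", {"name": "Ada", "email": "ada@example.com"})
put("u2", {"name": "Grace", "roles": ["admin"], "tags": {"team": "db"}})

print(get("u2")["roles"])  # ['admin']
```

Real document databases add indexing, query languages, and persistence on top of this basic id-to-document mapping.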
In computing, Master Data Management (MDM) comprises a set of processes, governance, policies, standards, and tools that consistently defines and manages the master data of an organization (which may include reference data). An MDM tool can be used to support Master Data Management by removing duplicates, standardizing data (mass maintenance), and incorporating rules to prevent incorrect data from entering the system, in order to create an authoritative source of master data.
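Two of the MDM operations mentioned above, standardization and deduplication, can be sketched as follows. The field names and normalization rules here are purely illustrative assumptions, not taken from any specific MDM product:

```python
# Illustrative MDM-style cleanup: standardize records to a canonical
# form, then deduplicate them to build an authoritative master list.
def standardize(record):
    """Normalize a customer record (hypothetical fields and rules)."""
    return {
        "name": record["name"].strip().title(),
        "country": record["country"].strip().upper(),
    }

def deduplicate(records):
    """Keep one copy of each record after standardization."""
    seen, master = set(), []
    for rec in map(standardize, records):
        key = (rec["name"], rec["country"])
        if key not in seen:
            seen.add(key)
            master.append(rec)
    return master

raw = [
    {"name": " ada lovelace ", "country": "uk"},
    {"name": "Ada Lovelace", "country": "UK"},   # duplicate after cleanup
    {"name": "Grace Hopper", "country": "us"},
]
print(deduplicate(raw))  # two master records remain
```

Production MDM systems use much richer matching (fuzzy comparison, survivorship rules) than the exact-key check shown here.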
Natural language processing (NLP) is a subfield of linguistics, computer science, information engineering, and artificial intelligence concerned with the interactions between computers and human (natural) languages, in particular how to program computers to process and analyze large amounts of natural language data.
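A tiny example of the kind of processing NLP automates is tokenizing raw text and counting word frequencies; real NLP pipelines (parsing, tagging, embeddings) go far beyond this sketch:

```python
import re
from collections import Counter

def tokenize(text):
    """Lowercase the text and split it into word tokens."""
    return re.findall(r"[a-z']+", text.lower())

# Count how often each token appears in a small sample sentence.
counts = Counter(tokenize("The cat sat on the mat. The mat was flat."))
print(counts.most_common(2))  # [('the', 3), ('mat', 2)]
```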
“Information technology” (IT) likely first appeared in a Harvard Business Review article in November 1958, and refers to the use of computing technology to create, process, manage, store, retrieve, share, and distribute information (data).
Early use of the term did not discriminate between types of information or data, but in practice, until the late 1970s, business applications were limited to structured data that could be managed by information systems based on hierarchical and then relational databases. Also see content technology and unstructured data.
Microsoft's puzzling announcements
Jean-Louis Gassée has some good questions, including… “Is Microsoft trying to implement a 21st century version of its old Embrace and Extend maneuver — on Google’s devices and collaboration software this time?” Read More
Integrated innovation and the rise of complexity
While Stephen O’Grady’s post isn’t addressing Microsoft’s recent Surface announcements as Gassée was, it is an interesting companion, or standalone, read. Read More
Google and ambient computing
‘Ambient computing’ has mostly been associated with the Internet of Things (IoT). There are many types of computing things, but the most important, from a world-domination perspective, are those at the center of (still human) experience and decision-making; that is, mobile (and still desktop) computing devices. The biggest challenge is the interoperability required at scale, which is fundamental to computing platform growth and competitive strategies (see Gassée’s question above). Ben Thompson analyzes Google’s recent announcements in this context. Read More
Attention marketers: in 12 weeks, the CCPA will be the national data privacy standard. Here’s why
Now it’s 10 weeks. Tim Walters makes a good case for his prediction even though other states are working on their own legislation, and Nevada has a policy already in effect. Read More
The Gilbane Advisor curates content for content, computing, and digital experience professionals. We focus on strategic technologies. We publish more or less twice a month except for August and December.