Curated for content, computing, and digital experience professionals

Category: Computing & data

Computing and data is a broad category. Our coverage of computing is largely limited to software, and we are mostly focused on unstructured data, semi-structured data, or mixed data that includes structured data.

Topics include computing platforms, analytics, data science, data modeling, database technologies, machine learning / AI, Internet of Things (IoT), blockchain, augmented reality, bots, programming languages, natural language processing applications such as machine translation, and knowledge graphs.

Related categories: Semantic technologies, Web technologies & information standards, and Internet and platforms.

Metadata

The term metadata refers to “data about data”, though in practice both uses of “data” are loose: they can refer to structured, unstructured, or semi-structured data, and metadata can be descriptive or prescriptive. Metadata can also describe physical objects.

Metadata is especially useful for creating, managing, publishing, categorizing, searching, and enhancing digital information. See the Wikipedia page on the Dublin Core for a good description.

https://en.wikipedia.org/wiki/Dublin_Core
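
To make this concrete, below is a minimal sketch of a Dublin Core–style descriptive record for a web page, rendered as HTML meta tags using the common “DC.” prefix convention. The record values and the render_meta_tags helper are hypothetical, chosen only for illustration.

```python
# Minimal sketch: a descriptive metadata record using a few of the fifteen
# Dublin Core elements, rendered as HTML <meta> tags ("DC." prefix convention).
# The values and the helper function below are illustrative, not a real record.

dublin_core_record = {
    "title": "Gilbane Advisor 10-22-19",
    "creator": "The Gilbane Advisor",
    "date": "2019-10-22",
    "format": "text/html",
    "language": "en",
    "subject": "interoperability; ambient computing; CCPA",
}

def render_meta_tags(record: dict) -> str:
    """Render a Dublin Core record as HTML meta tags for a page's head section."""
    return "\n".join(
        f'<meta name="DC.{element}" content="{value}">'
        for element, value in record.items()
    )

print(render_meta_tags(dublin_core_record))
```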

 

Computer-aided translation

Now more commonly known as machine translation (MT), computer-aided translation refers to the use of software to translate text or speech from one language to another. In the 1980s and 1990s MT software was largely rule-based; since the 2000s, statistical methods and, more recently, the re-emergence of neural networks and other advanced machine learning techniques have proved far more successful.
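
As a rough illustration of the neural approach, here is a minimal sketch using the open-source Hugging Face transformers library with a pretrained English-to-German Marian model; the library, model name, and example sentence are our choices for illustration, not something drawn from the entry above.

```python
# Minimal sketch: neural machine translation with a pretrained Marian model
# (assumes the transformers and torch packages are installed).
from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-en-de"  # English -> German
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

# Tokenize the source text, generate a translation, and decode it back to text.
batch = tokenizer(["Metadata is data about data."], return_tensors="pt", padding=True)
generated = model.generate(**batch)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```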

 

Categorization

Categorization is the process by which ideas and objects are recognized, differentiated, and understood. Categorization implies that objects are grouped into categories, usually for some specific purpose. Ideally, a category illuminates a relationship between the subjects and objects of knowledge. Categorization is fundamental in language, prediction, inference, decision making, and all kinds of environmental interaction.

Augmented reality

Augmented reality (AR) is a live, direct or indirect view of a physical, real-world environment whose elements are augmented by computer-generated sensory input such as sound, video, graphics, or GPS data. It is related to the more general concept of mediated reality, in which a view of reality is modified (possibly even diminished rather than augmented) by a computer. As a result, the technology functions by enhancing one’s current perception of reality.

Information technology

The term “information technology” (IT) likely first appeared in a Harvard Business Review article in November 1958, and refers to the use of computing technology to create, process, manage, store, retrieve, share, and distribute information (data).

Early use of the term did not distinguish between types of information or data, but in practice, until the late 1970s, business applications were limited to structured data that could be managed by information systems based on hierarchical and then relational databases. Also see content technology and unstructured data.

Gilbane Advisor 10-22-19 — Interoperability, ambient computing, CCPA

Microsoft’s puzzling announcements

Jean-Louis Gassée has some good questions, including… “Is Microsoft trying to implement a 21st century version of its old Embrace and Extend maneuver — on Google’s devices and collaboration software this time?” Read More

Microsoft Duo

Integrated innovation and the rise of complexity

While Stephen O’Grady’s post isn’t addressing Microsoft’s recent Surface announcements as Gassée was, it is an interesting companion, or standalone, read. Read More

Google and ambient computing

‘Ambient computing’ has mostly been associated with the Internet of Things (IoT). There are many types of computing things. But the most important, from a world domination perspective, are those at the center of (still human) experience and decision-making; that is, mobile (and still desktop) computing devices. The biggest challenge is the interoperability required at scale. This is fundamental to computing platform growth and competitive strategies (see Gassée’s question above). Ben Thompson analyzes Google’s recent announcements in this context. Read More

Attention marketers: in 12 weeks, the CCPA will be the national data privacy standard. Here’s why

Now it’s 10 weeks. Tim Walters makes a good case for his prediction even though other states are working on their own legislation, and Nevada has a policy already in effect. Read More

Also…

The Gilbane Advisor curates content for content, computing, and digital experience professionals. We focus on strategic technologies. We publish more or less twice a month except for August and December.

Internet of Things

The Internet of Things refers to uniquely identifiable objects (things) and their virtual representations in an Internet-like structure. The term Internet of Things was first used by Kevin Ashton in 1999. The concept of the Internet of Things first became popular through the Auto-ID Center and related market analyst publications. Radio-frequency identification is often seen as a prerequisite for the Internet of Things.
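
As a sketch of what a “virtual representation” of a uniquely identifiable thing might look like, here is a small, hypothetical example keyed by an RFID/EPC-style tag identifier; the class, field names, and sample values are illustrative assumptions, not part of any standard.

```python
# Minimal sketch: a virtual representation of a physical thing, keyed by a
# unique identifier such as an EPC read from an RFID tag.
# All names and sample values here are illustrative.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ThingRepresentation:
    tag_id: str                            # unique identifier for the physical object
    kind: str                              # what the object is, e.g. "pallet"
    last_seen: Optional[datetime] = None   # when the thing last reported
    state: dict = field(default_factory=dict)

    def report(self, **readings) -> None:
        """Record the latest sensor readings reported for this thing."""
        self.state.update(readings)
        self.last_seen = datetime.now(timezone.utc)

pallet = ThingRepresentation(tag_id="urn:epc:id:sgtin:0614141.107346.2017", kind="pallet")
pallet.report(location="dock-4", temperature_c=4.2)
print(pallet)
```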


© 2024 The Gilbane Advisor
