The Problem with Microsoft’s OpenAI Investment? Simple

There is nothing resembling a consensus in the AI research community regarding how or when AGI will be achieved.

Getting a billion-dollar commitment from Microsoft surely was not easy, but creating a decentralized AI, data and processing ecosystem is most likely a dramatically more difficult task.

Cutting-edge AI R&D now requires large amounts of computing power, and leveraging all this compute power requires expert support staff beyond just an AI research team.

Some leading AI researchers are skeptical that current deep learning technology is the right direction to follow to achieve AGI.

OpenAI, like Google DeepMind and Facebook AI Research — all of which pursue AGI-oriented research alongside immediately practical AI development — stands firmly in the deep learning camp.

But OpenAI’s basic AI technology orientation is about the same as that of Google DeepMind, Google Brain, Facebook AI Research, Tencent, Alibaba or Baidu.

Those of us in the small but rapidly growing decentralized AI space are actively pushing back against the increasing centralization of resources that the Microsoft-ization of OpenAI exemplifies.

This article was summarized automatically with AI / Article-Σ ™/ BuildR BOT™.


Natural Language Processing vs Natural Language Understanding

In this article, we will discuss the differences between natural language processing (NLP) and natural language understanding (NLU).

Natural Language Understanding (NLU)

Natural Language Understanding (NLU), also called Natural Language Interpretation (NLI), is the subtopic of NLP that deals with machine reading comprehension by breaking speech or text down into its elemental pieces.

Although the terms natural language processing (NLP) and natural language understanding (NLU) are closely related and sometimes used interchangeably, the two concepts are distinct, with only a thin layer of overlap.

With the help of artificial intelligence and computational linguistics, machines can learn the natural language used by humans more efficiently.

As a subtopic of NLP, NLU helps derive human-like understanding from unstructured data.

Architecture

In NLP, the overall pipeline architecture can be either batch processing or real-time processing.
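The batch vs. real-time distinction can be sketched with a toy pipeline. This is an illustrative example, not from the article; the stage names and the frequency-counting step are hypothetical stand-ins for real pipeline stages.

```python
def tokenize(text):
    # Naive whitespace tokenizer; real pipelines use trained tokenizers.
    return text.lower().split()

def pipeline(text):
    # A minimal NLP pipeline stage: tokenize, then count token frequencies.
    freqs = {}
    for tok in tokenize(text):
        freqs[tok] = freqs.get(tok, 0) + 1
    return freqs

def batch_process(corpus):
    # Batch architecture: run the pipeline over a stored collection at once.
    return [pipeline(doc) for doc in corpus]

def realtime_process(stream):
    # Real-time architecture: yield a result as each document arrives.
    for doc in stream:
        yield pipeline(doc)
```

The same `pipeline` function serves both architectures; only the driving loop changes, which is the essence of the batch-vs-real-time choice.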

NLU works with raw, unstructured data in order to uncover its insights and meaning.
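The processing-vs-understanding contrast can be illustrated with a toy example (hypothetical, not from the article): an NLP-style step manipulates the surface form of text, while an NLU-style step maps it to meaning, here a crude keyword-based intent label.

```python
def nlp_process(text):
    # NLP-style surface processing: normalize case, strip end punctuation, split.
    return text.lower().rstrip("?!.").split()

# Hypothetical keyword-to-intent table for illustration only.
INTENT_KEYWORDS = {
    "book": "booking",
    "cancel": "cancellation",
    "weather": "weather_query",
}

def nlu_understand(text):
    # NLU-style interpretation: infer what the user means, not just the tokens.
    for token in nlp_process(text):
        if token in INTENT_KEYWORDS:
            return INTENT_KEYWORDS[token]
    return "unknown"
```

Real NLU systems use trained models rather than keyword tables, but the division of labor is the same: processing produces structure, understanding produces meaning.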



ottokart/punctuator2: A bidirectional recurrent neural network model with attention mechanism for restoring missing punctuation in unsegmented text

DEMO and DEMO2

A bidirectional recurrent neural network model with an attention mechanism for restoring missing inter-word punctuation in unsegmented text.

The model can be trained in two stages (the second stage is optional).

The first stage is trained on punctuation-annotated text.

The optional second stage can be trained on punctuation- and pause-annotated text. In this stage the model learns to combine pause durations with textual features and adapts to the target domain. The second stage can be used, for example, for restoring punctuation in automatic speech recognition system output.

With default settings, an optimal Theano installation and a modern GPU, training speed should be around 10,000 words per second.

Example: to be ,COMMA or not to be ,COMMA that is the question .PERIOD

(Optional) Pause-annotated text files can be supplied for training and validation of the second-stage model.
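A minimal sketch of producing that annotation format from ordinary punctuated text. This is inferred from the example shown above, not taken from the repository's own preprocessing scripts, and the tag names beyond ,COMMA and .PERIOD are assumptions.

```python
# Map trailing punctuation characters to the training tags seen in the example.
# ?QUESTIONMARK is an assumed tag, extrapolated from the ,COMMA/.PERIOD pattern.
PUNCT_TAGS = {",": ",COMMA", ".": ".PERIOD", "?": "?QUESTIONMARK"}

def annotate(text):
    # Replace punctuation attached to a word with a separate annotation token.
    out = []
    for word in text.split():
        if word and word[-1] in PUNCT_TAGS:
            out.append(word[:-1])
            out.append(PUNCT_TAGS[word[-1]])
        else:
            out.append(word)
    return " ".join(out)
```

For instance, `annotate("to be, or not to be, that is the question.")` reproduces the example line above.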

