Predictive modeling is always a fun task. Most of the time goes into understanding what the business needs and then framing the problem; the next step is to tailor the solution to those needs. After solving many problems, we find that a framework can be used to build our first-cut models. Not only does such a framework give you faster results, it also helps you plan next steps based on those results.
There is a book, ‘Automate the Boring Stuff with Python’, and with it I started my Python programming journey from scratch. And I fell in love with this programming language.
Artificial Intelligence (AI) and Machine Learning (ML) are the new black of the IT industry. While discussions over the safety of its development keep escalating, developers continue to expand the abilities and capacity of artificial intelligence. Today AI has gone far beyond a science-fiction idea; it has become a necessity. Widely used for processing and analyzing huge volumes of data, AI helps handle work that can no longer be done manually because of its significantly increased volume and intensity.
One of the most widely used techniques in NLP is sentiment analysis. Sentiment analysis is most useful in cases such as customer surveys, reviews and social media comments, where people express their opinions and feedback. The simplest output of sentiment analysis is a 3-point scale: positive/negative/neutral. In more complex cases the output can be a numeric score that can be bucketed into as many categories as required.
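The bucketing step can be sketched in a few lines. This is a minimal illustration: the polarity scale in [-1.0, 1.0] and the ±0.05 cutoffs are assumptions for the example, not values from any particular sentiment library, and should be tuned per application.

```python
def bucket_sentiment(score: float) -> str:
    """Map a polarity score in [-1.0, 1.0] to a 3-point scale.

    The +/-0.05 thresholds are illustrative assumptions; in practice
    they are tuned on labeled data for the task at hand.
    """
    if score > 0.05:
        return "positive"
    if score < -0.05:
        return "negative"
    return "neutral"
```

The same pattern extends to a 5-point scale (or finer) simply by adding more threshold bands.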
In the case of our text snippet, the customer clearly expresses different sentiments in various parts of the text, so a single overall score is not very useful. Instead, we can find the sentiment of each sentence and separate out the negative and positive parts of the review. Sentence-level sentiment scores can also help us pick out the most negative and most positive parts of the review.
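The per-sentence approach can be sketched as follows. The word lists, the scoring rule, and the sample review are all toy assumptions made for illustration; a real system would use a trained model or a curated lexicon (such as VADER) rather than a hand-picked handful of words.

```python
import re

# Toy word lists for illustration only; real systems use trained models
# or curated sentiment lexicons with many thousands of entries.
POSITIVE = {"great", "love", "excellent", "friendly", "fast"}
NEGATIVE = {"terrible", "slow", "rude", "broken", "disappointed"}

def sentence_score(sentence: str) -> float:
    """Crude polarity: (+1 per positive word, -1 per negative) / word count."""
    words = re.findall(r"[a-z']+", sentence.lower())
    if not words:
        return 0.0
    raw = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return raw / len(words)

def split_review(review: str):
    """Score each sentence; return the most positive and most negative one."""
    sentences = [s.strip() for s in re.split(r"[.!?]+", review) if s.strip()]
    scored = [(sentence_score(s), s) for s in sentences]
    most_positive = max(scored)[1]
    most_negative = min(scored)[1]
    return most_positive, most_negative

review = ("The staff were friendly and I love the food. "
          "However, the service was terrible and slow.")
pos, neg = split_review(review)
```

Here the mixed review is pulled apart into its praise (the first sentence) and its complaint (the second), which is exactly the separation a single document-level score hides.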
An NLP model with over one trillion parameters will be built.
In 2019 OpenAI published GPT-2, the first NLP model with over 1 billion parameters (it had 1.5 billion). At the time this was seen as staggeringly large. In 2020 OpenAI dropped GPT-3 on the world, which had a whopping 175 billion parameters.
The transformer “arms race” will continue in 2021 with the publication of the first model with over 1 trillion parameters. Most likely this model will come from OpenAI and be named GPT-4. Other organizations that might break the trillion-parameter-model mark include Microsoft, NVIDIA, Facebook and Google.
The “MLOps” category will begin to undergo significant market consolidation.
A spate of startups building tools and infrastructure for machine learning has emerged in recent years. Relatively few of these “AI picks and shovels” startups will survive as large standalone companies. Meaningful consolidation will begin to take place in this category in 2021.
Startups building specialized “point solutions” will be scooped up by larger players seeking to develop comprehensive, end-to-end model development platforms. Intel’s dual acquisitions of SigOpt and Cnvrg.io this year are canaries in the coal mine.
Likely acquisition targets: Alectio, Algorithmia, Arize AI, Arthur AI, Comet, DarwinAI, Fiddler Labs, Gradio, OctoML, Paperspace, Snorkel AI, Truera, Verta, Weights & Biases, et al.
Likely acquirers: IBM, Microsoft, Amazon, Databricks, DataRobot, Oracle