OpenAI releases new features for DIY GPTs

ChatGPT developer OpenAI has introduced a slew of new features to its platform, including faster (and cheaper!) versions of ChatGPT and DALL-E, as well as a platform for building DIY AI Assistants and DIY versions of its LLM apps, known as GPTs. OpenAI is clearly looking to the financial success of the Apple App Store and is laying the groundwork for custom GPTs to be monetised and resold on the OpenAI platform.
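
For developers, the Assistants side of the announcement is the most hands-on piece. The snippet below is a rough sketch only: it assumes the openai Python package (v1.x) with an OPENAI_API_KEY set in the environment, and the assistant name, instructions and model choice are illustrative rather than OpenAI’s own example.

```python
# Rough sketch: creating and running a DIY assistant via OpenAI's Assistants API.
# Assumes the openai Python package (v1.x) and OPENAI_API_KEY in the environment;
# names and instructions below are illustrative assumptions.
from openai import OpenAI

client = OpenAI()

assistant = client.beta.assistants.create(
    name="Web Science Digest Bot",            # hypothetical assistant name
    instructions="Summarise Web Science news in plain English.",
    model="gpt-4-1106-preview",                # the faster/cheaper model announced at DevDay
)

# Conversations happen in threads: create one, add a user message, then run the assistant.
thread = client.beta.threads.create()
client.beta.threads.messages.create(
    thread_id=thread.id,
    role="user",
    content="What did OpenAI announce this week?",
)
run = client.beta.threads.runs.create(
    thread_id=thread.id,
    assistant_id=assistant.id,
)
print(run.status)  # poll until this reports "completed", then read the thread's messages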

These releases have been greeted with enthusiasm by some developers, whilst others could see their current business offerings wiped out at a stroke.

What doesn’t seem to be changing in this space is the dizzying speed of change, and OpenAI’s CEO Sam Altman playfully quipped that, however impressed we may all be by this release, it will “seem quaint” compared with what the company has planned for next year.

A sobering reminder that those entering this space early may invest heavily in building services and features that OpenAI could be giving away for free next year (or even next month!).

To borrow from the Latin phrase caveat emptor (buyer beware), the businesses springing up daily around OpenAI may wish to heed CAVEAT MERCATOR (merchant beware).

UK Gov calls for greater use of facial recognition.

The UK government is encouraging police to expand their use of retrospective facial recognition (RFR) software to identify offenders. The policing minister has recommended that forces carry out more than 200,000 image searches against the Police National Database within the next six months. This push, along with the drive for more live facial recognition cameras, has sparked concerns about civil liberties and increased surveillance.

Facial recognition technology, which can identify individuals even when part of the face is hidden, is seen by some as a valuable tool for law enforcement. Critics, however, argue for its cessation due to concerns about unchecked use, lack of parliamentary approval, and potential data protection and human rights violations.

The Home Office insists that facial recognition is strictly regulated and used only for legitimate policing purposes. It claims that AI-driven surveillance can help identify individuals wanted for serious crimes and find missing persons, improving police efficiency and community presence.

While the Home Office has safeguards, including the immediate deletion of unmatched data, controversies persist. The data regulator, the ICO, has warned about facial recognition use, and wrongful apprehensions have occurred during trials. Moreover, Cambridge University researchers have questioned the ethics and legality of facial recognition use by police.

X wants us to Grok …

Elon Musk’s xAI (sister company to X, née Twitter) has just released a beta product named Grok, the name borrowed from sci-fi author Robert Heinlein’s coinage meaning “to understand”.

Grok is based on large language models (LLMs) like ChatGPT and has been trained on millions of articles from the Web. Where it differs is in retrieving up-to-date facts in real time from the X platform (amongst other sources), rather than supplementing the model with uploaded knowledge files via so-called retrieval-augmented generation (RAG), as ChatGPT now allows. For such a new and relatively young project, Grok seems to be faring well on performance scores, beating or equalling many of the established closed- and open-source products (including GPT-3.5) whilst falling short of the performance of GPT-4.
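
To make the retrieval idea concrete, here is a minimal, self-contained sketch of the retrieval-augmented pattern described above. The corpus, the keyword-overlap scoring and the build_prompt helper are illustrative assumptions for this article, not Grok’s or OpenAI’s actual (non-public) implementation.

```python
# Minimal sketch of retrieval-augmented generation (RAG).
# The corpus, the overlap score and the prompt template are illustrative
# assumptions, not Grok's or ChatGPT's actual implementation.

def score(query: str, doc: str) -> int:
    """Toy relevance score: how many query words appear in the document."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Return the k documents that best match the query."""
    return sorted(corpus, key=lambda doc: score(query, doc), reverse=True)[:k]

def build_prompt(query: str, corpus: list[str]) -> str:
    """Stuff the most relevant snippets into the prompt so the model answers
    from fresh context rather than from its training data alone."""
    context = "\n".join(retrieve(query, corpus))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

if __name__ == "__main__":
    # In Grok's case the corpus would be live posts pulled from X;
    # here it is a hard-coded list purely for illustration.
    posts = [
        "xAI has released a beta chatbot called Grok to X Premium subscribers.",
        "OpenAI announced custom GPTs and an Assistants API at its developer day.",
        "The UK policing minister has urged wider use of facial recognition.",
    ]
    print(build_prompt("What is Grok and who can use it?", posts))
```

The design point is simply that the retrieval step runs at question time: for Grok that step is against live X posts, so the context handed to the model can be minutes old rather than frozen at a training cut-off.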

Also distinctive is its Musk-like penchant for sarcasm, which it uses to side-step what X is calling “spicy” prompts, the sort that many other GPT variants either ignore or meet with a fairly weak “I don’t know” or some other kill-joy response.

Musk’s new chatbot will soon be available to X Premium subscribers, and whilst it won’t actually tell you how to cook crystal meth, it will at least try to amuse you in the process of NOT telling you how to cook it.