August 4, 2023

ChatGPT & AI: Actions For Businesses Who Have Decided Not To Build Something In House

A few months ago I thought a majority of companies would end up with proprietary Large Language Models (LLMs) which they had fine-tuned to give them company-specific context and knowledge. I wasn’t alone in this. The benefits of a model that knows how things are done “around here” are potentially massive, and lots of providers had sprung up, making adapting existing models as easy as possible. Professional services companies with lots of historical data were keen to use it. Now, in Summer 2023, it feels as if many of these efforts have been slowed by challenges familiar to corporate technology projects: incomplete data and requirements, as well as a preference for a fully productised wrapper, which takes time and resources to build.

The biggest change, however, has been how quickly existing service providers have managed to integrate high quality Generative AI tools into existing software. Incumbents have always sought to make the new thing a feature, rather than a new product. The relative ease of building Generative AI systems (which was apparent even in Spring) has made it easier for Microsoft, Google, Adobe and others to release suites of new features which are so good that they have taken some of the wind out of the DIY sails - as well as steamrolling many function-specific startups in the process. 

We now know that Google’s proprietary LLM has been integrated into Search for several years, likely powering the impressive autocomplete functions. On opening the Google Doc where I wrote this, I was offered the chance to immediately include a template for meeting notes, an email or a CV. Photoshop has been quick to integrate image generation and editing capabilities that are only a matter of weeks behind cutting-edge services from Midjourney and Stability. Perhaps most significantly, Microsoft, who still dominate the enterprise office software market, have announced plans to roll Copilot out broadly through their products. Initially a tool for helping write computer code, Copilot will (once widely available) help with a huge array of tasks in Word, Excel and PowerPoint.

This leaves the companies that we speak to asking variants of the same question: I have decided I am not going to build anything in house, so, what should I do? 

We believe there are four key priorities for organisations, whether private or public, in this position.

  1. Set policies on using existing tools and make sure that your employees understand them. Whilst OpenAI (the makers of ChatGPT) have recently clarified their position on training on data that is submitted to them, many organisations remain uncomfortable with proprietary data being shared via a box on the internet. If you haven’t taken steps to prevent this, it is a safe bet that employees are already using these tools for some, or much, of their daily work. Beyond information security, AI-generated content tends to home in on the middle of the road; employees need to be encouraged to use these tools as a starting point rather than a finished product, or quality will suffer. Much of government has decided to block access until information safeguarding and retention requirements can be met. Nevertheless, many employees are using these tools valuably for tasks which present no privacy or commercial risk, and are consequently increasing their productivity. Organisations need to make a clear decision about their risk appetite and ensure that everyone follows the policy. As a first step, senior managers need to understand how their teams are using these tools already.
  2. Get information from front-line staff about how generative tools are helping them. The flipside of the point above is to make sure that where ChatGPT or other tools are making a meaningful difference to workflows, this is captured as best practice and shared. A leaked Amazon memo showed that the leadership team surveyed every team in the company to source ideas for useful Generative AI tools to build. This needn’t just apply to ways of using ChatGPT. Generative tools, from image editors and PDF Q&A chatbots to proposal writers, are now widely available, both free and as subscription products. Organisations should make sure some people have responsibility for scanning the marketplace of options so that high-value-add products can be adopted quickly.
  3. Start designing training for staff on how to make the most of generative tools every day. The result of tech giants rapidly integrating these functions is that all workers will find GenAI options starting to pop up in software that they already use. Organisations with less tech-confident staff, or which want to make changes as quickly as possible, should consider rolling out training. This will be particularly important as Copilot becomes available later in 2023. Beyond this, educate teams on how to use tools like ChatGPT and Supermind Ideator to explore problems; tools like Afforai to digest large text documents and answer questions on them; and tools like Playground AI to create bespoke images in place of stock photos. YouTube guides on how to use these services are readily available. Nobody has more than a few months’ head start with any of these products, so for teams that embrace new opportunities there is the chance to become expert quickly.
  4. Prepare for others to use Generative AI as they interact with your services. To adapt a phrase: ask not what Generative AI can do for you, but what you can do about Generative AI in the hands of others. The majority of companies we speak to have started thinking about how they might integrate new tools into their workflows. Comparatively few have grasped the fact that others will quickly start sending them AI-generated content, at volumes and quality which will break many existing processes. Job applications will be autogenerated from the description; customers will be able to buy services which write complaints emails about every single purchase, hoping for a refund; legal challenges and proposals for work will get easier to write and multiply. At the most pernicious end, phishing emails seeking to compromise IT security will become more convincing and tailored to the individual target. Correct responses to these challenges are still forming, but I am encouraged to see work like the Civic AI Observatory, a collaboration between NESTA and Newspeak House, starting to help civic organisations tackle many of these challenges.

Organisations across industries are grappling with the challenge posed by rapidly evolving technology and the consequent changes to working practices. Paradigm Junction help companies and public bodies to:

  • Stay abreast of developments in this fast paced environment 
  • Apply them to the specific context of your industry and organisation 
  • Turn this into concrete actions you can take to mitigate risks or seize a competitive advantage

For more information on how Paradigm Junction can help you & your business email james@paradigmjunction.com

Related posts

Computers aren't supposed to be able to do that

18 months ago he would have been right. To me, it had stopped being remarkable that they now can.

Introduction to Futures and Foresight for Emerging Technologies

No one is sure about how the capabilities of AI will develop, let alone how business, government and society will respond to them. But this doesn’t mean that you should stand still. Tools exist for helping decision makers make smart choices in the face of uncertainty about the future, where traditional forecasts are liable to falter.

Apple Vision Pro - Seeing The World Through A New Lens

The Vision Pro isn’t only an AR/VR play from Apple - it’s a first bid to equip us with the tools we will want to be using in the future world of work.

Models and Beauty Contests

In the face of a changing world, refusing to act until you can be sure of the outcome is a decision in and of itself. Sometimes wait and see is the correct decision. Other times, it is an invitation to competitors to steal a march.