
When AI Becomes the Only Work Tool


How ChatGPT’s new “Chat with Apps” feature changes everything for knowledge-based companies – and why the demand for private language models is exploding. 

The launch of Chat with Apps marks a before-and-after moment in enterprise digitalization. OpenAI now enables language models to connect directly to external applications and data sources. This means users no longer need to switch between systems, files, and menus—you simply ask ChatGPT to retrieve a report, update a document, or generate an analysis, and it does it for you. For the first time, a language interface can become the central work tool rather than an add-on. 

The implications are profound. Where organizations once needed consultants to build integrations between systems, a language model can now connect directly to cloud solutions—with a single click. For accounting firms, law offices, and consulting companies, this means clients will increasingly expect to retrieve information themselves. Instead of ordering reports and analyses, they will ask the AI directly. 

This does not just change workflows—it changes business models. 

An accounting firm that once billed for reporting and data collection will now find that a client can ask ChatGPT to “show all invoices over 50,000 from last quarter” and get the answer in seconds. Much of the value in manual data processing disappears. The competitive advantage shifts from execution to insight. What will matter going forward is building and owning private language models that understand how the business actually works—with proprietary sources, internal guidelines, and verified data. 

Here lies the challenge: while ChatGPT and similar systems are powerful in the consumer market, they cannot be used directly on all enterprise data. The information is confidential, complex, and often distributed across legacy servers, local files, and custom systems. To unlock real value, a dedicated layer must be built between the model and the data—a private, secure information foundation that can be indexed and queried across sources.

 

The need for private language models

Once a language interface gains access to your systems, the question is no longer if you use AI, but how. What previously required custom development will now be available to everyone—provided the right data access is in place. This makes data governance, access control, and indexing mission-critical topics.

Private language models—built using Retrieval-Augmented Generation (RAG)—make it possible to combine proprietary data sources with the model’s linguistic understanding. Instead of guessing based on probable words, the model retrieves answers from actual documents, systems, and databases. This allows organizations to maintain security, traceability, and control, while users enjoy the same simple experience they know from ChatGPT.
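The retrieval step described above can be sketched in a few lines. This is a minimal, illustrative example only: toy word-overlap scoring stands in for the vector embeddings a production RAG system would use, and the document contents are invented for the demonstration.

```python
# Minimal sketch of the retrieval step in Retrieval-Augmented Generation (RAG).
# Documents are scored by word overlap with the query; a real system would use
# vector embeddings and pass the prompt to a language model for the final answer.

def build_index(documents):
    """Pre-tokenize each document into a set of lowercase words."""
    return [(doc, set(doc.lower().split())) for doc in documents]

def retrieve(index, query, top_k=2):
    """Return the top_k documents sharing the most words with the query."""
    query_words = set(query.lower().split())
    scored = sorted(index, key=lambda item: len(item[1] & query_words), reverse=True)
    return [doc for doc, _ in scored[:top_k]]

def build_prompt(context_docs, question):
    """Ground the model's answer in retrieved documents, not guesswork."""
    context = "\n".join(f"- {doc}" for doc in context_docs)
    return f"Answer using only these sources:\n{context}\n\nQuestion: {question}"

# Invented sample data for illustration.
docs = [
    "Invoice 1042: 62,000 NOK, issued in Q3 to Acme AS.",
    "Travel policy: employees book flights through the internal portal.",
    "Invoice 0981: 48,500 NOK, issued in Q2 to Nordic Ltd.",
]
index = build_index(docs)
hits = retrieve(index, "show invoices from Q3", top_k=1)
print(build_prompt(hits, "Which invoices were issued in Q3?"))
```

Because the answer is assembled from retrieved documents rather than generated from the model's general training data, every claim can be traced back to a source, which is what gives organizations the security and traceability mentioned above.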

For companies with many employees and diverse systems, the key question becomes: how do you bring everything together in one solution? How do you give employees a unified gateway to information—without risking data leakage or loss of control?

 

The new architecture

The next phase of digitalization is not about buying more systems—it is about connecting them. As ChatGPT begins to communicate directly with apps, it becomes crucial that organizations have an internal architecture capable of managing:

  • Access control – ensuring sensitive information is shared only with the right people.
  • Indexing – ensuring that historical documents, emails, and reports can actually be retrieved in real time.
  • Integration with private language models – ensuring all content is used securely and remains traceable.

This requires a new kind of IT strategy. The winners will not be those who buy the most AI tools—but those who build the right infrastructure to use them.

 

Europe’s dilemma

Europe now faces a dilemma. American players are leading the development, but they do not necessarily meet European standards for security, data storage, and compliance. If AI is to be used in industries handling sensitive information—healthcare, finance, law, and the public sector—we must adopt local, private solutions.

This is where Norway and Europe can take a position. We need language models with local data processing. This is not just a question of technology—it is about sovereignty and trust.

With Chat with Apps, AI becomes the one work tool that sits in front of every other system. Users will expect to communicate directly with their systems and get answers. The real differentiation going forward will lie in who can make this happen securely, efficiently, and on their own data.

Private language models are not an addition to ChatGPT—they are the prerequisite for companies to truly use the technology.

And that is where the real competition begins. 

Do you need a private language model and a secure, compliant AI solution? Reach out to us and we will help you.
