Bye Bye Big Tech Step 5: AI assistants and chatbots

Leaving Big Tech might not always seem easy at first. Where do you begin, and what are actually good alternatives? To help you on your way, we’ve put together a step-by-step guide with alternatives. Here you can find all the steps.

It is almost impossible to imagine our online environment without AI assistants. After ChatGPT launched in 2022, predictions tumbled over each other: education would change radically, "mindless" work would cease to exist, and/or AI would become self-aware and disrupt society as a whole.

Alongside such grand predictions, it is interesting to see that the business model behind Large Language Models (LLMs), such as ChatGPT, is not so groundbreaking. The models behind AI assistants are trained on huge amounts of data found online, often without user permission. You might have already guessed: this data contains a lot of sensitive personal information. It is especially reprehensible that the largest and most widely used AI systems are developed and hosted by Big Tech companies, which means the data they use to train their AIs also comes from the other products they supply.

But it's not just the dependence on Big Tech that makes AI usage thorny. It is also important to realize that people are being exploited to make generative AI possible: AI training data must be filtered and labeled, work that is done under very poor conditions. In addition, AI has a major impact on the climate. Data center parks the size of small cities are dedicated to AI alone, and they consume huge amounts of electricity, leading to a lot of CO2 emissions.

If you still want to work with an AI assistant or chatbot, always be critical of the answers it gives. It is increasingly difficult to figure out how an algorithm arrives at a response or decision, and who should be held responsible when the answer is wrong.

Fortunately, there are safer ways to work with AI. If you do not want AI to be trained on the information you send, or that information to be shared with third parties, it is advisable to choose an AI that you can run locally on your device. To do this you download an AI model; your exchanges with it never leave your device, and the AI works locally and offline. That is more private and more secure. Please note that some AI models take up more memory than others, and don't expect them to be as fast or as smart as the models behind ChatGPT.

Our recommendations

Here you can read the selection criteria and conditions for the recommendations below.

Jan.ai (desktop)

  • https://www.jan.ai/;
  • AI models are local, on your own device;
  • does not train with user data;
  • requires at least 8 GB of RAM;
  • multiple open source AI models available.

PocketPal (mobile)

  • AI models are local, on your own device;
  • does not train with user data;
  • requires at least 8 GB of RAM;
  • multiple open source AI models available.

OLMo 2

Proton Lumo

  • https://lumo.proton.me;
  • runs on Proton's servers (so avoid sending sensitive information to Lumo);
  • is not trained with user data;
  • has to comply with strong Swiss privacy laws;
  • works in the browser and has a mobile app.

How to switch (desktop)

For AI on your desktop, you can choose Jan.ai, which works locally and offline on your machine, or Proton Lumo in the browser.

  • 1

    Download and install

    Go to https://www.jan.ai/ and download the app. Once the download is complete, install it on your machine.

  • 2

    Choose a model to use

    To choose a suitable model, pay attention to how much memory your device has. For a laptop with 8 GB of memory, the '7B' models can be used. We recommend OLMo 2, because it is developed fully open source, including the training data. Other interesting models are Mistral, developed by Mistral AI in France, and EuroLLM, a language model that supports all European languages.

    The different versions of these models can all be found on Hugging Face, a platform where AI models, datasets and applications are shared. Search Hugging Face for 'OLMo 2', 'Mistral' or 'EuroLLM' to see all available versions.

  • 3

    Download the model in Jan.ai

    Open the app and click 'Hub' in the left menu. Look for the model you chose in the previous step. Can't you find the model when searching for 'OLMo-2'? Then copy the full model name including the developer and paste it into the search bar; for OLMo-2, for example: allenai/OLMo-2-1124-7B-Instruct-GGUF.

    Choose a model with 'Instruct' and 'GGUF' in the name. 'Instruct' means that the model is optimized for chatting; models without 'Instruct' are not recommended. 'GGUF' is the format that Jan.ai uses to load models, so models that are not in GGUF format cannot be read by the Jan app.

    Examples for models of Mistral and EuroLLM suitable for use in the Jan.ai app:

    MaziyarPanahi/Mistral-Nemo-Instruct-2407-GGUF
    utter-project/EuroLLM-9B-Instruct (despite its name this is a GGUF format)

    Select the model you are looking for and download it. Jan.ai will then prepare the model for you.

  • 4

    Use the model in a chat

    Go to 'New Chat', select the model you just downloaded under the chat window and chat away!

    Does the Jan.ai app crash or get stuck after you ask a question? Then chances are you've chosen a language model that is not suitable for your device. Try a lighter model:

    • Did you use OLMo-2 7B? Try OLMo-2 1B (allenai/OLMo-2-0425-1B-Instruct-GGUF in Jan.ai's hub) or find another lighter model under 7B (e.g. 4B).
    • Did you use EuroLLM 9B? Try OLMo-2 7B (allenai/OLMo-2-1124-7B-Instruct-GGUF in Jan.ai's hub) or find another lighter model under 9B (e.g. 7B).
    • Did you use Mistral Nemo? Try OLMo-2 7B (allenai/OLMo-2-1124-7B-Instruct-GGUF in Jan.ai's hub) or find another lighter model under 12B (e.g. 9B).
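The rule of thumb in step 2, that a '7B' model fits on an 8 GB machine, can be sketched with some back-of-the-envelope arithmetic. This is only an illustration: the real footprint depends on the quantization variant, the context length and the app itself, and the 4.5 bits-per-weight figure and 1 GB overhead below are assumptions for the 4-bit ('Q4') quantizations mentioned in this guide.

```python
# Rough estimate of the RAM a quantized ("GGUF") language model needs.
# Assumptions (not exact figures): ~4.5 bits per weight for a Q4-style
# quantization, plus ~1 GB of overhead for context and the runtime.

def model_ram_gb(params_billion: float, bits_per_weight: float = 4.5) -> float:
    """Approximate RAM needed, in GB, for a Q4-quantized model."""
    weights_gb = params_billion * bits_per_weight / 8  # weight data itself
    overhead_gb = 1.0                                  # assumed runtime buffer
    return weights_gb + overhead_gb

for name, size in [("OLMo-2 1B", 1), ("OLMo-2 7B", 7),
                   ("EuroLLM 9B", 9), ("Mistral Nemo 12B", 12)]:
    print(f"{name}: ~{model_ram_gb(size):.1f} GB RAM")
```

By this estimate a 7B model needs roughly 5 GB, which leaves headroom on an 8 GB laptop, while a 12B model such as Mistral Nemo is already tight — which matches the fallback advice above.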

How to switch (Mobile)

To use AI on your phone, you can choose between PocketPal, which runs locally and offline on your device, and Proton Lumo.

  • 1

    Download and install the app

    Depending on your operating system, go to the App Store or the Play Store and download PocketPal, to run AI locally, or the Proton Lumo app. If you choose the Proton Lumo app, you are done and ready to go. If you go for a local AI, complete the next steps.

  • 2

    Choose a model to use

    To choose a suitable model, pay attention to how much memory your device has. For a mobile phone with 8 GB of memory, the '7B' models can be used. We recommend OLMo 2, because it is developed fully open source, including the training data. Other interesting models are Mistral, developed by Mistral AI in France, and EuroLLM, a language model that supports all European languages.

    The different versions of these models can all be found on Hugging Face, a platform where AI models, datasets and applications are shared. Search Hugging Face for 'OLMo 2', 'Mistral' or 'EuroLLM' to see all available versions.

  • 3

    Download the model in PocketPal

    Open PocketPal, click on the menu icon at the top left, click on 'Models' and then on the plus icon at the bottom right of the screen. Click 'Add from Hugging Face'. Look for the model you chose in the previous step. Make sure you choose a model that has "instruct" in its name. This means that the model is optimized to chat with.

    Once you find the model, click on its name. You will get a pop-up with different versions of the model. To avoid diving into the technical details of what each version stands for, search for the version that ends with '...Q4_K_M.gguf' and download it. PocketPal will then prepare the model.

  • 4

    Use the model to chat

    Go to 'Chat', select the model you just downloaded under the chat window and chat away!

    Does PocketPal crash or get stuck after you ask a question? Then chances are you've chosen a language model that is not suitable for your device. Try a lighter model:

    • Did you use OLMo-2 7B? Try OLMo-2 1B or find another lighter model on Hugging Face (e.g. 4B).
    • Did you use Mistral 7B? Try OLMo-2 1B or find another lighter model on Hugging Face (e.g. 4B).
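The naming rules from step 3 — pick a model with 'Instruct' in the name and, for local apps, the 'GGUF' format — can be sketched as a small filter. The model names below are the ones used in this guide; the check itself is just a heuristic on the name.

```python
# Check whether a Hugging Face model name matches this guide's advice:
# "Instruct" means the model is tuned for chatting, and "GGUF" is the
# file format that local apps like Jan.ai load.

def is_suitable(model_name: str) -> bool:
    lower = model_name.lower()
    return "instruct" in lower and "gguf" in lower

candidates = [
    "allenai/OLMo-2-1124-7B-Instruct-GGUF",
    "MaziyarPanahi/Mistral-Nemo-Instruct-2407-GGUF",
    "allenai/OLMo-2-1124-7B",  # base model: not tuned for chat, skip it
]

for name in candidates:
    print(name, "->", "ok" if is_suitable(name) else "skip")
```

As the EuroLLM example in step 3 shows, the name check is only a heuristic: that model is distributed in GGUF format even though 'GGUF' does not appear in its name, so when in doubt check the model card on Hugging Face.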

"Most of the time AI operates within a black box: you know what goes in and what comes out, but not what happens in between. Like all technology, AI contains a bias, and it hallucinates. You can never blindly trust AI. Make sure you can still answer the questions: which data has been processed by whom, where does the result come from, and is the result correct? Always stay critical of your AI usage."

Martijn

Wow, great job!

You have taken a big step towards freedom from Big Tech by switching to open source AI assistants and chatbots, away from prying eyes and unaccountable, dominant companies that will always be hungry for more data. Have fun talking to an AI on your terms, and not those of a company!

Share this step with your network

Would you like to show others you made the switch to an open source AI assistant and encourage them to make the switch too? Share this page with your network. The more people that wave Big Tech goodbye, the bigger the impact! You can also download the badges and put them in your email signature. Do you also want to take action offline? Order our sticker set in the webshop.
