
All Newslinks - Page: 15
BBCWorld - 10 hours ago: South Korea's data protection regulator says user data was sent to the Chinese owner of TikTok. (Newslink ©2025 to BBCWorld)
PC World - 10 hours ago: While Chromebooks are generally limited compared to Windows and macOS laptops, I personally made the switch to Chromebooks full-time a while ago and haven’t looked back. One of the big reasons for that switch — and why I’m a big Chromebook advocate to this day — is just how effortlessly secure ChromeOS is for everyday users.
Google built ChromeOS to be as secure as possible, with features designed to limit your exposure to malware. Sure, you can find some of these features in other operating systems too, but all of these coming together in one overall package is what makes ChromeOS great.
Here are the core security features of ChromeOS that make Chromebooks safe and keep you protected while using your laptop.
Related: The best Chromebooks worth buying today
Sandboxing for everything
Sandboxing is a technique where certain apps and processes are run in isolated environments, aptly called “sandboxes.” You can think of a sandbox as a virtual bubble that has limited access to the overall system. By running software in a bubble like this, you’re protected in case it’s infected with malware that tries to spread.
You may be familiar with Windows Sandbox, but you need Windows 11 Pro to access that feature. Meanwhile, in ChromeOS, sandboxing isn’t optional — everything from system services to browser tabs runs within its own separate sandbox, and these sandboxes operate with the fewest possible privileges. They only have access to the resources they need, limiting the amount of damage they can do if compromised.
So even if you catch a malware infection, there’s little chance that the attack could escalate privileges and affect critical processes. In fact, over many years of using ChromeOS, I have yet to experience a single security issue, let alone a major put-me-in-full-on-panic-mode issue.
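The principle is easy to demonstrate. The sketch below is a crude analogue only — ChromeOS actually builds its sandboxes from kernel mechanisms like namespaces and seccomp, not Python — but it shows the least-privilege idea: a child process starts with nothing the parent didn't explicitly grant.

```python
import subprocess
import sys

def run_isolated(code: str) -> str:
    """Run untrusted code in a child process with an empty environment --
    a toy stand-in for the least-privilege isolation ChromeOS applies
    to every tab and system service."""
    result = subprocess.run(
        [sys.executable, "-c", code],
        env={},                 # child inherits no environment variables
        capture_output=True, text=True, timeout=10,
    )
    return result.stdout.strip()

# The child cannot see anything the parent did not explicitly grant it.
print(run_isolated("import os; print(os.environ.get('HOME', 'no access'))"))
```

A compromised process confined like this can still run, but it has nothing to steal and nowhere to spread.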
Verified Boot for OS authenticity
Verified Boot means that every time you start ChromeOS, it checks to make sure that the system hasn’t been corrupted or tampered with since the last time it ran. This is done using cryptographically signed system images, which ensure that everything running on your Chromebook is as expected and as it should be.
First, ChromeOS checks the firmware in a read-only partition (that attackers can’t access or change). Next, ChromeOS checks and compares the kernel and system files to ensure nothing has been altered.
If everything checks out, ChromeOS boots normally. But if something (anything) is out of place, ChromeOS either reverts to a previous (secure) version of the operating system or, in extreme cases, prompts you to reinstall ChromeOS in Recovery Mode.
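The chain of checks can be sketched in a few lines. This is the idea, not Google's implementation: the known-good digests and toy image bytes below are hypothetical, standing in for the signed hashes stored in the read-only firmware.

```python
import hashlib

# Hypothetical table of known-good digests, playing the role of the
# signatures baked into the read-only firmware partition.
EXPECTED = {
    "kernel": hashlib.sha256(b"kernel-v1").hexdigest(),
    "rootfs": hashlib.sha256(b"rootfs-v1").hexdigest(),
}

def verify(name: str, image: bytes) -> bool:
    """Each boot stage checks the next one's hash before handing over."""
    return hashlib.sha256(image).hexdigest() == EXPECTED[name]

def boot(kernel: bytes, rootfs: bytes) -> str:
    if verify("kernel", kernel) and verify("rootfs", rootfs):
        return "boot normally"
    return "fall back to previous version or Recovery Mode"

print(boot(b"kernel-v1", b"rootfs-v1"))  # untouched images boot
print(boot(b"kernel-v1", b"tampered"))   # any single changed byte is caught
```

Because a hash changes completely when even one byte of the image changes, tampering anywhere in the chain is detected before the system finishes booting.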
Read-only system files
As I mentioned above, ChromeOS has a read-only partition for core system files, including the kernel, system libraries, and other essential components. This partition can’t be altered. (ChromeOS has a separate read/write partition for settings, apps, user data, and the like.)
Doing this protects the core system files from things like malicious modification by hackers, but it also protects against accidental harm — by poorly written apps, rogue extensions, user error, etc.
What about when core system files need updating? ChromeOS first applies updates to an inactive partition while the system is being used. Then, when you next reboot your Chromebook, it switches partitions and runs the Verified Boot check. If an error is detected, ChromeOS reverts to the previous version of the operating system.
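This A/B ("seamless") update flow can be sketched as follows — a simplified model assuming two root partitions and a pointer to the active one, with a boolean standing in for the Verified Boot check.

```python
# Two root partitions; updates always go to whichever one is idle.
slots = {"A": "v1", "B": "v1"}
active = "A"

def inactive() -> str:
    return "B" if active == "A" else "A"

def apply_update(new_version: str, healthy: bool) -> str:
    """Write the update to the idle slot; switch over only if it verifies.
    `healthy` stands in for the Verified Boot check at next reboot."""
    global active
    slots[inactive()] = new_version
    if healthy:
        active = inactive()          # switch to the freshly updated slot
    return slots[active]             # a bad update leaves the old OS active

print(apply_update("v2", healthy=True))   # good update: now running v2
print(apply_update("v3", healthy=False))  # bad update: still running v2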
Regular automatic updates
One thing I love about ChromeOS is the stress-free update process. Unlike Windows updates, ChromeOS updates are automatic, consistent, and handled in the background, without any user involvement beyond restarting your Chromebook when updates are complete.
Regular system updates are so important for patching security flaws and vulnerabilities. When updating is a huge ordeal, you end up putting it off and putting it off until you have time for it. With ChromeOS, updates are frequent, which means each update is relatively small and painless, and then you restart in a matter of seconds. It’s easy!
Given how often Google updates ChromeOS, the operating system is able to combat existing and emerging threats quickly and seamlessly, and that keeps you protected.
Recovery Mode and Safety Reset
Most operating systems have a recovery mode, so ChromeOS isn’t unique just for having one. But Recovery Mode does help keep ChromeOS secure, and the big difference is that Recovery Mode in ChromeOS is more user-friendly than in, say, Windows.
Recovery Mode is a way to restore the operating system back to factory settings (or an earlier version), which comes in handy when something goes wrong and the system stops working. That could happen due to corrupted system files, a failed update, performance issues, etc.
With ChromeOS, you can use Recovery Mode to reinstall the operating system while clearing all user data, and then you can restore that user data from your Google account. More recently, Google even implemented a new Safety Reset feature that lets you reinstall ChromeOS without losing your data.
Cloud-first approach for data
Google’s cloud-first approach is divisive, but it does have some positive implications for security. For starters, cloud-based apps are less susceptible to malware than traditional apps. They aren’t completely immune, but the difference is non-trivial.
Having sensitive data stored in the cloud also lessens the risks associated with loss or theft of your Chromebook. And if your Chromebook does get lost or stolen, you can easily revoke access to your data (so the thief can’t do anything with it) and you can recover your data by signing into your cloud accounts on a different device.
And for schools or businesses that manage hundreds of Chromebooks through Google Admin Console, cloud control can ensure that policies are enforced, apps are deployed (or blocked), and everyone’s devices are kept up-to-date at all times.
Limited access to third-party apps
For the most part, if you want to download and install apps on your Chromebook, you’re doing it through the Google Play Store. And while the Play Store isn’t perfect, it does have a vetting process that helps minimize the chance of running into malware.
Can you install third-party apps on your Chromebook? Yeah, but it’s risky. You can also install Android and Linux apps from some sources. Fortunately, Google warns you when you try to install unknown apps like this — and again, apps are run in sandboxes, which protects the rest of your system in case you somehow bring malware aboard.
Further reading: Chromebooks vs. laptops: What you need to know
PC World - 10 hours ago: The AI chatbot ChatGPT from OpenAI triggered the hype surrounding generative artificial intelligence and dominates much of the media coverage.
However, in addition to the AI models from OpenAI, there are other chatbots that deserve attention. And unlike ChatGPT, these can also run locally on your PC and be used free of charge for an unlimited period of time.
We’ll show you four local chatbots that also run on older hardware. You can talk to them or create texts with them.
The chatbots presented here generally consist of two parts: a front end and an AI model, the large language model (LLM).
You decide which model runs in the front end after installing the tool. Operation is not difficult once you know the basics. Some of the chatbots offer very extensive setting options, and using those requires expert knowledge, but the bots also work well with the standard settings.
See also: What is an AI PC, exactly? We cut through the hype
What local AI can do
What you can expect from a local large language model (LLM) also depends on what you offer it: LLMs need computing power and a lot of RAM to be able to respond quickly.
If these requirements are not met, the large models will not even start and the small ones will take an agonizingly long time to respond. Things are faster with a current graphics card from Nvidia or AMD, as most local chatbots and AI models can then utilize the hardware’s GPU.
If you only have a weak graphics card in your PC, everything has to be calculated by the CPU — and that takes time.
If you only have 8GB of RAM in your PC, you can only start very small AI models. Although they can provide correct answers to a number of simple questions, they quickly run into problems with peripheral topics. Computers that offer 12GB RAM are already quite good, but 16GB RAM or more is even better.
Then even AI models that work with 7 to 12 billion parameters will run smoothly. You can usually recognize how many parameters a model has by its name: a suffix such as 2B or 7B stands for 2 or 7 billion parameters.
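As a rough rule of thumb — an approximation, not a vendor spec — a quantized model needs about parameters × bytes-per-weight of RAM, plus some overhead for the runtime and context. The helper below (4-bit weights and a 20 percent overhead factor are assumptions) shows why the RAM tiers above line up with model sizes the way they do.

```python
def est_ram_gb(params_billion: float, bits_per_weight: int = 4,
               overhead: float = 1.2) -> float:
    """Back-of-the-envelope RAM estimate for a quantized local model.
    4-bit quantization and 20% overhead are illustrative assumptions."""
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return round(bytes_total * overhead / 1e9, 1)

print(est_ram_gb(2))   # ~1.2 GB: why 2B-class models run on modest PCs
print(est_ram_gb(7))   # ~4.2 GB: model weights alone, before the OS
print(est_ram_gb(12))  # ~7.2 GB: why 16GB machines handle 7-12B models
```

Halve the bits (8-bit instead of 4-bit quantization) and the footprint doubles, which is why the same model is offered in several quantized variants.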
Recommendation for your hardware: Gemma 2 2B, with 2.6 billion parameters, already runs with 8GB RAM and without GPU support. The results are generally fast and well structured. If you need an even less demanding AI model, you can use Llama 3.2 1B in the chatbot LM Studio, for example.
If your PC is equipped with a lot of RAM and a fast GPU, try Gemma 2 7B or a slightly larger Llama model, such as Llama 3.1 8B. You can load the models via the chatbots Msty, GPT4All, or LM Studio.
Information on the AI models for the Llamafiles can be found below. And for your information: ChatGPT from OpenAI is not available for local use on the PC. The apps and PC tools from OpenAI send all requests to the internet.
The most important steps
Using the various chatbots is very similar: you install the tool, load an AI model through it, and then switch to the chat area of the program. And you’re ready to go.
With the Llamafile chatbot, there is no need to download the model, as an AI model is already integrated in the Llamafile. This is why there are several Llamafiles, each with a different model.
See also: The AI PC revolution: 18 essential terms you need to know
Llamafile
Llamafiles are the simplest way to communicate with a local chatbot. The aim of the project is to make AI accessible to everyone. That’s why the creators pack all the necessary files, i.e. the front end and the AI model, into a single file — the Llamafile.
This file only needs to be started and the chatbot can be used in the browser. However, the user interface is not very attractive.
The Llamafile chatbot is available in different versions, each with different AI models. With the Llava model, you can also integrate images into the chat. Overall, Llamafile is easy to use as a chatbot.
Simple installation
Only one file is downloaded to your computer. The file name differs depending on the model selected.
For example, if you have selected the Llamafile with the Llava 1.5 model with 7 billion parameters, the file is called “llava-v1.5-7b-q4.llamafile.” As the file extension .exe is missing here, you must rename the file in Windows Explorer after downloading.
You can ignore the warning from Windows Explorer by clicking “Yes.” The file name will then be “llava-v1.5-7b-q4.llamafile.exe.” Double-click on the file to start the chatbot. On older PCs, it may take a moment before Microsoft Defender SmartScreen issues a warning.
Click on “Run anyway.” A command prompt window opens, but this only runs the program itself. The chatbot does not have its own user interface and must be operated in the browser. Start your default browser if it does not open automatically and enter the address 127.0.0.1:8080 or localhost:8080.
If you want to use a different AI model, you must download a different Llamafile. These can be found on Llamafile.ai further down the page in the “Other example llamafiles” table. Each Llamafile needs the file extension .exe.
Chatting with the Llamafile
The user interface in the browser shows the setting options for the chatbot at the top. The chat input is located at the bottom of the page under “Say something.”
If you have started a Llamafile with the model Llava (llava-v1.5-7b-q4.llamafile), you can not only chat, but also have images explained to you via “Upload Image” and “Send.” Llava stands for “Large Language and Vision Assistant.” To end the chatbot, simply close the prompt.
Tip: Llamafiles can be used in your own network. Start the chatbot on a powerful PC in your home network. Make sure that the other PCs are authorized to access this computer. You can then use the chatbot from those PCs via the internet browser and the address “[IP address]:8080,” replacing [IP address] with the address of the PC on which the chatbot is running.
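Because the Llamafile serves plain HTTP on port 8080, you can also script against it instead of using the browser. The sketch below assumes a recent build that exposes an OpenAI-compatible /v1/chat/completions endpoint — check this against your Llamafile version — and the `ask` helper and placeholder model name are illustrative, not part of the product.

```python
import json
import urllib.request

# Assumed endpoint of a locally running Llamafile server (verify for
# your build; older builds may only offer the /completion route).
URL = "http://127.0.0.1:8080/v1/chat/completions"

def build_payload(prompt: str) -> dict:
    """OpenAI-style chat payload; the model name is a placeholder,
    since the Llamafile serves whatever model it was built with."""
    return {"model": "local", "messages": [{"role": "user", "content": prompt}]}

def ask(prompt: str) -> str:
    """Send a chat request to the running Llamafile (needs the server up)."""
    req = urllib.request.Request(
        URL,
        data=json.dumps(build_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

print(build_payload("Hello")["messages"][0]["content"])  # prints Hello
```

With the server started on another PC in your network as described above, you would point URL at that machine's address instead of 127.0.0.1.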
Msty
Msty offers access to many language models, good user guidance, and the import of your own files for use in the AI. Not everything is self-explanatory, but it is easy to use after a short familiarization period.
If you want to make your own files available to the AI purely locally, you can do this in Msty in the so-called Knowledge Stack. That sounds a bit pretentious. However, Msty actually offers the best file integration of the four chatbots presented here.
Installation of Msty
Msty is available for download in two versions: one with support for Nvidia and AMD GPUs and the other for running on the CPU only. When you start the Msty installation wizard, you have the choice between a local installation (“Set up local AI”) or an installation on a server.
For the local installation, the Gemma 2 model is already selected in the lower part of the window. This model is only 1.6GB in size and is well suited for text creation on weaker hardware.
If you click on “Gemma2,” you can choose between five other models. Later, many more models can be loaded from a clearly organized library via “Local AI Models,” such as Gemma 2 2B or Llama 3.1 8B.
“Browse & Download Online Models” gives you access to the AI sites www.ollama.com and huggingface.co and therefore to most of the free AI models.
A special feature of Msty is that you can ask several AI models for advice at the same time. However, your PC should have enough memory to respond quickly. Otherwise you will have to wait a long time for the finished answers.
Pretty interface, lots of substance
Msty’s user interface is appealing and well structured. Of course, not everything is immediately obvious, but if you familiarize yourself with Msty, you can use the tool quickly, integrate new models, and integrate your own files. Msty provides access to the many, often cryptic options of the individual models, at least partially in graphical menus.
In addition, Msty offers so-called split chats. The user interface then displays two or more chats next to each other, and a different AI model can be selected for each one. You only have to enter your question once, which lets you compare several models with each other.
Add your own files
You can easily integrate your own files via “Knowledge Stacks.” You can choose which embedding model should prepare your data for the LLMs.
Mixedbread Embed Large is used by default, but other embedding models can also be loaded. Take care when selecting one, however, as online embedding models can also be chosen, for example from OpenAI.
That means your data is sent to OpenAI’s servers for processing. And the database created with your data is also online: every query then also goes to OpenAI.
Chat with your own files: After you have added your own documents to the “Knowledge Stacks,” select “Attach Knowledge Stack and Chat with them” below the chat input line. Tick the box in front of your stack and ask a question. The model will search through your data to find the answer. However, this does not work very well yet.
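What an embedding model does for a knowledge stack can be shown with a toy version. Real stacks use a trained model like Mixedbread Embed Large, not the word-count vectors below — this is only the retrieval mechanism in miniature: turn texts into vectors, then return the document whose vector is most similar to the question's.

```python
import math

def embed(text: str) -> dict:
    """Toy 'embedding': a bag-of-words count vector (real embedding
    models produce dense learned vectors instead)."""
    vec = {}
    for word in text.lower().split():
        vec[word] = vec.get(word, 0) + 1
    return vec

def cosine(a: dict, b: dict) -> float:
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[w] * b.get(w, 0) for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

docs = ["Verified Boot checks system integrity",
        "Gemma 2 runs with 8GB RAM"]
query = embed("how much RAM does Gemma need")
best = max(docs, key=lambda d: cosine(query, embed(d)))
print(best)  # the document sharing vocabulary with the question wins
```

The choice of embedding model matters because it decides which documents count as "similar" to a question — which is exactly why sending that step to an online provider means sending your data along with it.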
GPT4All
GPT4All offers a simple user interface, a solid selection of AI models, and the option of reading in your own files. The selection of chat models is smaller than with Msty, for example, but it is clearer, and additional models can be downloaded via huggingface.co.
The user interface is well structured and you can quickly find your way around.
Installation: Quick and easy
The installation of GPT4All was quick and easy for us. AI models can be selected under “Models.” Models such as Llama 3 8B, Llama 3.2 3B, Microsoft Phi 3 Mini, and EM German Mistral are presented.
Good: For each model, the amount of free RAM the PC needs for the model to run is specified. There is also access to AI models on huggingface.co using the search function. In addition, the online models from OpenAI (ChatGPT) and Mistral can be integrated via API keys — for those who don’t just want to chat locally.
Operation and chat
The user interface of GPT4All is similar to that of Msty, but with fewer functions and options. This makes it easier to use. After a short orientation phase, in which it is clarified how models can be loaded and where they can be selected for the chat, operation is easy.
Your own files can be made available to the AI models via “Localdocs.” In contrast to Msty, you cannot set which embedding model prepares the data; the nomic-embed-text-v1.5 model is used in all cases.
In our tests, the tool ran with good stability. However, it was not always clear whether a model was already fully loaded.
LM Studio
LM Studio offers user guidance for beginners, advanced users, and developers. Despite this categorization, it is aimed more at professionals than beginners: anyone working with LM Studio has access not only to many models, but also to their options.
The LM Studio chatbot not only gives you access to a large selection of AI models from huggingface.co, but also allows you to fine-tune the AI models. There is a separate developer view for this.
Straightforward installation
After installation, LM Studio greets you with the “Get your first LLM” button. Clicking on it offers a very small version of Meta’s LLM: Llama 3.2 1B.
This model should also run on older hardware without long waiting times. After downloading the model, it must be started via a pop-up window and “Load Model.” Additional models can be added using the Ctrl-Shift-M key combination or the “Discover” magnifying glass symbol, for example.
Chat and integrate documents
At the bottom of the LM Studio window, you can change the view of the program using the three buttons “User,” “Power User,” and “Developer.”
In the first case, the user interface is similar to that of ChatGPT in the browser; in the other two cases, the view is supplemented with additional information, such as how many tokens are contained in a response and how quickly they were calculated.
This and the access to many details of the AI models make LM Studio particularly interesting for advanced users. You can make many fine adjustments and view information.
Your own texts can only be integrated into a chat, but cannot be made permanently available to the language models. When you add a document to your chat, LM Studio automatically decides whether it is short enough to fit completely into the AI model’s prompt or not.
If not, the document is checked for important content using Retrieval Augmented Generation (RAG), and only this content is provided to the model in the chat. However, the text is often not captured in full.
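The fit-or-retrieve decision described above can be sketched in miniature. This is not LM Studio's pipeline — the word-based limit and the keyword-overlap scoring are toy stand-ins for token counting and real RAG retrieval — but it shows the two branches: short documents go in whole, long ones are reduced to their best-matching chunk.

```python
def attach_document(doc: str, question: str, context_limit: int = 50) -> str:
    """Toy version of the fit-or-retrieve decision: pass the whole
    document if it fits the (word-based) context limit, otherwise keep
    only the chunk that best matches the question's vocabulary."""
    words = doc.split()
    if len(words) <= context_limit:
        return doc                       # short enough: sent in full
    chunks = [" ".join(words[i:i + 25]) for i in range(0, len(words), 25)]
    q = set(question.lower().split())
    scored = sorted(chunks,
                    key=lambda c: len(q & set(c.lower().split())),
                    reverse=True)
    return scored[0]                     # best-matching chunk only

short = "ChromeOS updates install to an inactive partition."
print(attach_document(short, "How do updates work?"))  # fits: sent whole
```

The retrieval branch is also why long texts are "often not captured in full": only the chunks judged relevant ever reach the model.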
BBCWorld - 11 hours ago: The American star has had a top 10 single, Grammy award and a breathing exercise that's gone viral.
BBCWorld - 18 Feb: Edward Gardner had been threatened with a defamation action by singers at an opera house in Naples.
BBCWorld - 18 Feb: Who failed their initiation ceremony and who tapes their mouth up to sleep? Wales captain Jac Morgan took BBC Sport's quiz while at the Six Nations launch event.
BBCWorld - 18 Feb: BBC Sport looks at the landscape of the heavyweight division before Daniel Dubois' world title defence against Joseph Parker on Saturday.
BBCWorld - 18 Feb: The Champions Trophy may feel like a tournament from a bygone age but it is crucial for England, who need to lift fans after a dark winter.
BBCWorld - 18 Feb: Jannik Sinner's case was 'a million miles away from doping', says World Anti-Doping Agency general counsel Ross Wenzel.
BBCWorld - 18 Feb: Roo is being cared for by Mark Navin and Lauren Owen after his mum died shortly after his birth.