Mirror of https://github.com/privacyguides/i18n.git
synced 2025-11-16 21:32:39 +00:00
New Crowdin translations by GitHub Action
@@ -12,7 +12,7 @@ cover: ai-chatbots.webp
- [:material-account-cash: Surveillance Capitalism](basics/common-threats.md#surveillance-as-a-business-model){ .pg-brown }
- [:material-close-outline: Censorship](basics/common-threats.md#avoiding-censorship){ .pg-blue-gray }
Since the release of ChatGPT in 2022, interactions with Large Language Models (LLMs) have become increasingly common. LLMs can help us write better, understand unfamiliar subjects, or answer a wide range of questions. They can statistically predict the next word based on a vast amount of data scraped from the web.
The use of **AI chat**, also known as Large Language Models (LLMs), has become increasingly common since the release of ChatGPT in 2022. LLMs can help us write better, understand unfamiliar subjects, or answer a wide range of questions. They work by statistically predicting the next word in their responses based on a vast amount of data scraped from the web.
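The phrase "statistically predicting the next word" can be made concrete with a toy bigram model. This is only an illustrative sketch with a made-up corpus; real LLMs use neural networks over tokens, not raw word counts:

```python
from collections import Counter, defaultdict

# Tiny stand-in for the "vast amount of data scraped from the web"
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word (a bigram model,
# the simplest form of statistical next-word prediction)
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word: str) -> str:
    # Pick the statistically most likely continuation seen in the corpus
    return following[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat" follows "the" more often than any other word
```

An LLM does the same kind of conditional prediction, but over billions of parameters and the entire preceding context rather than a single previous word.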
## Privacy Concerns About LLMs
@@ -42,7 +42,7 @@ To run AI locally, you need both an AI model and an AI client.
### Choosing a Model
There are many permissively licensed models available to download. [Hugging Face](https://huggingface.co/models) is a platform that lets you browse, research, and download models in common formats like [GGUF](https://huggingface.co/docs/hub/en/gguf). Companies that provide good open-weights models include big names like Mistral, Meta, Microsoft, and Google. However, there are also many community models and 'fine-tunes' available. As mentioned above, quantized models offer the best balance between model quality and performance for those using consumer-grade hardware.
There are many permissively licensed models available to download. [Hugging Face](https://huggingface.co/models) is a platform that lets you browse, research, and download models in common formats like [GGUF](https://huggingface.co/docs/hub/en/gguf). Companies that provide good open-weights models include big names like Mistral, Meta, Microsoft, and Google. However, there are also many community models and [fine-tuned](https://en.wikipedia.org/wiki/Fine-tuning_\(deep_learning\)) models available. As mentioned above, quantized models offer the best balance between model quality and performance for those using consumer-grade hardware.
To help you choose a model that fits your needs, you can look at leaderboards and benchmarks. The most widely-used leaderboard is the community-driven [LM Arena](https://lmarena.ai). Additionally, the [OpenLLM Leaderboard](https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard) focuses on the performance of open-weights models on common benchmarks like [MMLU-Pro](https://arxiv.org/abs/2406.01574). There are also specialized benchmarks which measure factors like [emotional intelligence](https://eqbench.com), ["uncensored general intelligence"](https://huggingface.co/spaces/DontPlanToEnd/UGI-Leaderboard), and [many others](https://www.nebuly.com/blog/llm-leaderboards).
@@ -63,7 +63,7 @@ To help you choose a model that fits your needs, you can look at leaderboards an
Kobold.cpp is an AI client that runs locally on your Windows, Mac, or Linux computer. It's an excellent choice if you are looking for heavy customization and tweaking, such as for role-playing purposes.
**Kobold.cpp** is an AI client that runs locally on your Windows, Mac, or Linux computer. It's an excellent choice if you are looking for heavy customization and tweaking, such as for role-playing purposes.
In addition to supporting a large range of text models, Kobold.cpp also supports image generators such as [Stable Diffusion](https://stability.ai/stable-image) and automatic speech recognition tools such as [Whisper](https://github.com/ggerganov/whisper.cpp).
@@ -83,7 +83,7 @@ In addition to supporting a large range of text models, Kobold.cpp also supports
</div>
<div class="admonition note" markdown>
<div class="admonition info" markdown>
<p class="admonition-title">Compatibility Issues</p>
Kobold.cpp might not run on computers without AVX/AVX2 support.
@@ -98,7 +98,7 @@ Kobold.cpp allows you to modify parameters such as the AI model temperature and
Ollama is a command-line AI assistant that is available on macOS, Linux, and Windows. Ollama is a great choice if you're looking for an AI client that's easy to use, widely compatible, and fast due to its efficient inference and other optimization techniques. It also doesn't involve any manual setup.
**Ollama** is a command-line AI assistant that is available on macOS, Linux, and Windows. Ollama is a great choice if you're looking for an AI client that's easy to use, widely compatible, and fast due to its efficient inference and other optimization techniques. It also doesn't involve any manual setup.
In addition to supporting a wide range of text models, Ollama also supports [LLaVA](https://github.com/haotian-liu/LLaVA) models and has experimental support for Meta's [Llama vision capabilities](https://huggingface.co/blog/llama32#what-is-llama-32-vision).
@@ -124,9 +124,9 @@ Ollama simplifies the process of setting up a local AI chat by downloading the A
<div class="admonition recommendation" markdown>
Llamafile is a lightweight single-file executable that allows users to run LLMs locally on their own computers without any setup involved. It is [backed by Mozilla](https://hacks.mozilla.org/2023/11/introducing-llamafile) and available on Linux, macOS, and Windows.
**Llamafile** is a lightweight, single-file executable that allows users to run LLMs locally on their own computers without any setup involved. It is [backed by Mozilla](https://hacks.mozilla.org/2023/11/introducing-llamafile) and available on Linux, macOS, and Windows.
Llamafile also supports LLaVA. However, it doesn't support speech recognition or image generation.
@@ -138,7 +138,9 @@ Llamafile also supports LLaVA. However, it doesn't support speech recognition or
<details class="downloads" markdown>
<summary>Downloads</summary>
- [:fontawesome-solid-desktop: Desktop](https://github.com/Mozilla-Ocho/llamafile#quickstart)
- [:fontawesome-brands-windows: Windows](https://github.com/Mozilla-Ocho/llamafile#quickstart)
- [:simple-apple: macOS](https://github.com/Mozilla-Ocho/llamafile#quickstart)
- [:simple-linux: Linux](https://github.com/Mozilla-Ocho/llamafile#quickstart)
</details>
@@ -171,11 +173,11 @@ Please note we are not affiliated with any of the projects we recommend. In addi
### Minimum Requirements
- Must be open-source.
- Must be open-source.
- Must not transmit personal data, including chat data.
- Must be multi-platform.
- Must not require a GPU.
- Must support GPU-powered fast inference.
- Must support GPU-powered, fast inference.
- Must not require an internet connection.
### Best-Case Criteria
@@ -186,4 +188,11 @@ Our best-case criteria represent what we _would_ like to see from the perfect pr
- Should have a built-in model downloader option.
- The user should be able to modify the LLM parameters, such as its system prompt or temperature.
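The temperature parameter named in this criterion can be sketched as a scaling factor applied to the model's raw scores before they are turned into probabilities. This is an illustrative sketch, not any particular client's implementation:

```python
import math

def sample_distribution(logits, temperature=1.0):
    """Softmax over logits scaled by temperature: lower values sharpen the
    distribution (more deterministic output), higher values flatten it
    (more random, "creative" output)."""
    scaled = [logit / temperature for logit in logits]
    peak = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - peak) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]  # hypothetical raw scores for three candidate tokens
print(sample_distribution(logits, temperature=0.5))  # strongly favors the top token
print(sample_distribution(logits, temperature=2.0))  # spreads probability out
```

A client that exposes temperature is essentially letting the user choose the divisor in this scaling step before the next token is sampled.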
\*[LLaVA]: Large Language and Vision Assistant (multimodal AI model)
\*[LLM]: Large Language Model (AI model such as ChatGPT)
\*[LLMs]: Large Language Models (AI models such as ChatGPT)
\*[open-weights models]: AI models that anyone can download and use, but the underlying training data and/or algorithms for them are proprietary.
\*[system prompt]: The general instructions given by a human to guide how an AI chat should operate.
\*[temperature]: A parameter used in AI models to control the level of randomness and creativity in the generated text.
[^1]: A file checksum is a type of anti-tampering fingerprint. A developer usually provides a checksum in a text file that can be downloaded separately, or on the download page itself. Verifying that the checksum of the file you downloaded matches the one provided by the developer helps ensure that the file is genuine and wasn't tampered with in transit. You can use commands like `sha256sum` on Linux and macOS, or `certutil -hashfile file SHA256` on Windows to generate the downloaded file's checksum.
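The verification workflow in this footnote can be sketched as follows. The file names are hypothetical placeholders, and the demo generates its own checksum file; in practice you would download the checksum the developer published:

```shell
# Demo: create a file standing in for a downloaded model (hypothetical name)
printf 'model weights' > model.gguf

# A developer would normally publish this checksum file alongside the download;
# we generate it here only to demonstrate the verification step
sha256sum model.gguf > model.gguf.sha256   # macOS equivalent: shasum -a 256

# Verify the file against the checksum file; prints "model.gguf: OK" on a match
sha256sum --check model.gguf.sha256
```

If the file were modified in transit, the final command would report a mismatch and exit with a non-zero status.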
@@ -19,7 +19,7 @@ Even if you use OpenPGP, it does not support [forward secrecy](https://en.wikipe
## What is the Web Key Directory standard?
The Web Key Directory (WKD) standard allows email clients to discover the OpenPGP key for other mailboxes, even those hosted on a different provider. Email clients which support WKD will ask the recipient's server for a key based on the email address' domain name. For example, if you emailed `jonah@privacyguides.org`, your email client would ask `privacyguides.org` for Jonah's OpenPGP key, and if `privacyguides.org` has a key for that account, your message would be automatically encrypted.
The [Web Key Directory (WKD)](https://wiki.gnupg.org/WKD) standard allows email clients to discover the OpenPGP key for other mailboxes, even those hosted on a different provider. Email clients which support WKD will ask the recipient's server for a key based on the email address' domain name. For example, if you emailed `jonah@privacyguides.org`, your email client would ask `privacyguides.org` for Jonah's OpenPGP key, and if `privacyguides.org` has a key for that account, your message would be automatically encrypted.
In addition to the [email clients we recommend](../email-clients.md) which support WKD, some webmail providers also support WKD. Whether *your own* key is published to WKD for others to use depends on your domain configuration. If you use an [email provider](../email.md#openpgp-compatible-services) which supports WKD, such as Proton Mail or Mailbox.org, they can publish your OpenPGP key on their domain for you.
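The lookup described above can be illustrated by constructing the URL a WKD-aware client would fetch. This is a rough sketch based on the IETF WKD draft's "direct method"; the path hashing (SHA-1 of the lowercased local part, z-base-32 encoded) is an assumption drawn from that draft, not from this article:

```python
import hashlib

# z-base-32 alphabet used by the WKD draft for encoding the hashed local part
ZBASE32 = "ybndrfg8ejkmcpqxot1uwisza345h769"

def zbase32_encode(data: bytes) -> str:
    """Encode bytes as z-base-32 (no padding), as required by WKD."""
    bits = "".join(f"{byte:08b}" for byte in data)
    bits += "0" * (-len(bits) % 5)  # pad the bit string to a multiple of 5
    return "".join(ZBASE32[int(bits[i:i + 5], 2)] for i in range(0, len(bits), 5))

def wkd_direct_url(email: str) -> str:
    """Build the WKD 'direct method' URL a client would fetch for this address."""
    local, domain = email.split("@")
    digest = hashlib.sha1(local.lower().encode()).digest()  # SHA-1 of local part
    return (f"https://{domain}/.well-known/openpgpkey/hu/"
            f"{zbase32_encode(digest)}?l={local}")

print(wkd_direct_url("jonah@privacyguides.org"))
```

A client such as GnuPG performs this construction internally (falling back to the "advanced method" on an `openpgpkey` subdomain) and encrypts to the key it finds at that URL.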
@@ -34,7 +34,7 @@ global:
## OpenPGP Compatible Services
These providers natively support OpenPGP encryption/decryption and the [Web Key Directory standard](basics/email-security.md#what-is-the-web-key-directory-standard), allowing for provider-agnostic E2EE (end-to-end encrypted) emails. For example, a Proton Mail user can send an E2EE message to a Mailbox.org user, or you can receive OpenPGP-encrypted notifications from internet services which support it.
These providers natively support OpenPGP encryption/decryption and the [Web Key Directory (WKD) standard](basics/email-security.md#what-is-the-web-key-directory-standard), allowing for provider-agnostic E2EE emails. For example, a Proton Mail user can send an E2EE message to a Mailbox.org user, or you can receive OpenPGP-encrypted notifications from internet services which support it.
<div class="grid cards" markdown>
@@ -107,7 +107,7 @@ Proton Mail offers email and [calendar](https://proton.me/news/protoncalend
#### :material-check:{ .pg-green } Email Encryption
Proton Mail has [integrated OpenPGP encryption](https://proton.me/support/how-to-use-pgp) in their webmail. Emails to other Proton Mail accounts are encrypted automatically, and encryption to non-Proton Mail addresses with an OpenPGP key can be enabled easily in your account settings. Proton also supports automatic external key discovery with [Web Key Directory (WKD)](https://wiki.gnupg.org/WKD). This means that emails sent to other providers which use WKD will be automatically encrypted with OpenPGP as well, without the need to manually exchange public PGP keys with your contacts. They also allow you to [encrypt messages to non-Proton Mail addresses without OpenPGP](https://proton.me/support/password-protected-emails), without the need for the recipient to sign up for a Proton Mail account.
Proton Mail has [integrated OpenPGP encryption](https://proton.me/support/how-to-use-pgp) in their webmail. Emails to other Proton Mail accounts are encrypted automatically, and encryption to non-Proton Mail addresses with an OpenPGP key can be enabled easily in your account settings. Proton also supports automatic external key discovery with WKD. This means that emails sent to other providers which use WKD will be automatically encrypted with OpenPGP as well, without the need to manually exchange public PGP keys with your contacts. They also allow you to [encrypt messages to non-Proton Mail addresses without OpenPGP](https://proton.me/support/password-protected-emails), without the need for the recipient to sign up for a Proton Mail account.
Proton Mail publishes the public keys of Proton accounts via HTTP from their WKD. This allows people who don't use Proton Mail to find the OpenPGP keys of Proton Mail accounts easily, for cross-provider E2EE. This only applies to email addresses ending in one of Proton's own domains, like @proton.me. If you use a custom domain, you must [configure WKD](./basics/email-security.md#what-is-the-web-key-directory-standard) separately.
@@ -164,7 +164,7 @@ Mailbox.org offers an [encrypted mailbox](https://kb.mailbox.or
Mailbox.org has [integrated encryption](https://kb.mailbox.org/en/private/e-mail-article/send-encrypted-e-mails-with-guard) in their webmail, which simplifies sending messages to people with public OpenPGP keys. They also allow [remote recipients to decrypt an email](https://kb.mailbox.org/en/private/e-mail-article/my-recipient-does-not-use-pgp) on Mailbox.org's servers. This feature is useful when the remote recipient does not have OpenPGP and cannot decrypt a copy of the email in their own mailbox.
Mailbox.org also supports the discovery of public keys via HTTP from their [Web Key Directory (WKD)](https://wiki.gnupg.org/WKD). This allows people who don't use Mailbox.org to find the OpenPGP keys of Mailbox.org accounts easily, enabling cross-provider E2EE. This only applies to email addresses ending in one of Mailbox.org's own domains, like @mailbox.org. If you use a custom domain, you must [configure WKD](./basics/email-security.md#what-is-the-web-key-directory-standard) separately.
Mailbox.org also supports the discovery of public keys via HTTP from their WKD. This allows people who don't use Mailbox.org to find the OpenPGP keys of Mailbox.org accounts easily, enabling cross-provider E2EE. This only applies to email addresses ending in one of Mailbox.org's own domains, like @mailbox.org. If you use a custom domain, you must [configure WKD](./basics/email-security.md#what-is-the-web-key-directory-standard) separately.
#### :material-information-outline:{ .pg-blue } Account Termination
@@ -323,7 +323,7 @@ Since Stalwart does **not** have webmail, you will need to use it with a [dedicated email cl
- Encryption of all account data (contacts, calendars, etc.) at rest with zero-access encryption.
- Integrated webmail E2EE/PGP encryption provided as a convenience.
- Support for [WKD](https://wiki.gnupg.org/WKD) to allow improved discovery of public OpenPGP keys via HTTP. GnuPG users can get a key by typing: `gpg --locate-key example_user@example.com`
- Support for WKD to allow improved discovery of public OpenPGP keys via HTTP. GnuPG users can get a key by typing: `gpg --locate-key example_user@example.com`
- Support for a temporary mailbox for external users. This is useful when you want to send an encrypted email without sending an actual copy to your recipient. These emails usually have a limited lifespan and are then automatically deleted. They also don't require the recipient to configure any cryptography like OpenPGP.
- Availability of the email provider's services via an [.onion service](https://en.wikipedia.org/wiki/.onion).
- [Sub-addressing](https://en.wikipedia.org/wiki/Email_address#Sub-addressing) support.
@@ -356,7 +356,7 @@ We [recommend](dns.md#recommended-providers) a number of encrypted DNS servers b
<div class="grid cards" markdown>
- { .twemoji loading=lazy } [Kobold.cpp](ai-chat.md#koboldcpp)
- { .twemoji loading=lazy } [Llamafile](ai-chat.md#llamafile)
- { .twemoji loading=lazy } [Ollama (CLI)](ai-chat.md#ollama-cli)
</div>