Artificial Intelligence (AI)


Artificial Intelligence is a Euphemism. / AI is often called Open Source but is actually non-freedom software.

Artificial Intelligence is a Euphemism

Artificial intelligence at this time should not be called "intelligence." It has no intelligence whatsoever.

LLMs do not perform reasoning over data in the way that most people conceive or desire.

There is no self-reflection over its own information; it does not know what it knows and what it does not know. The line between hallucination and truth is simply a probability, shaped by the prevalence of training data and by post-training processes such as fine-tuning. Reliability will always be nothing more than a probability built on top of this architecture.

As such, it is unsuitable as a machine for finding rare hidden truths or valuable neglected information. It will always simply converge toward the popular narrative or data. At best, it can provide new permutations of views on existing, well-known concepts, but it cannot invent new concepts or reveal concepts that are rarely spoken about. (https://www.mindprison.cc/p/the-question-that-no-llm-can-answer)
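To make the probability point above concrete, the following is a minimal sketch of a toy word-sequence prediction model: a hand-made bigram frequency table. The tiny training text and all names are invented for illustration, and real LLMs use neural networks over tokens rather than a frequency table, but the principle demonstrated is the same: the "answer" is only a weighted random draw from whatever the training data made probable.

    # Toy illustration: next-word prediction as nothing more than
    # probabilities derived from training-data frequencies.
    # Hand-made bigram model, not how a real LLM is implemented, but the
    # principle (sample the next word from a learned distribution) is the same.
    import random
    from collections import Counter, defaultdict

    training_text = (
        "the cat sat on the mat . the cat ate the fish . "
        "the dog sat on the rug ."
    ).split()

    # Count how often each word follows each other word.
    bigram_counts = defaultdict(Counter)
    for prev, nxt in zip(training_text, training_text[1:]):
        bigram_counts[prev][nxt] += 1

    def next_word(prev):
        counts = bigram_counts[prev]
        words = list(counts)
        weights = [counts[w] for w in words]
        # The "answer" is just a weighted random draw: continuations that are
        # more frequent in the training data are simply more probable.
        return random.choices(words, weights=weights)[0]

    word = "the"
    generated = [word]
    for _ in range(8):
        word = next_word(word)
        generated.append(word)
    print(" ".join(generated))

The output is fluent-looking word sequences with no notion of truth behind them; scaling this idea up changes the quality of the statistics, not the nature of the mechanism.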

“Artificial Intelligence”

The moral panic over ChatGPT has led to confusion because people often speak of it as “artificial intelligence.” Is ChatGPT properly described as artificial intelligence? Should we call it that? Professor Sussman of the MIT Artificial Intelligence Lab argues convincingly that we should not.

Normally, “intelligence” means having knowledge and understanding, at least about some kinds of things. A true artificial intelligence should have some knowledge and understanding. General artificial intelligence would be able to know and understand about all sorts of things; that does not exist, but we do have systems of limited artificial intelligence which can know and understand in certain limited fields.

By contrast, ChatGPT knows nothing and understands nothing. Its output is merely smooth babbling. Anything it states or implies about reality is fabrication (unless “fabrication” implies more understanding than that system really has). Seeking a correct answer to any real question in ChatGPT output is folly, as many have learned to their dismay.

That is not a matter of implementation details. It is an inherent limitation due to the fundamental approach these systems use.

[...] (GNU project)

Mislabeling a text generator as "intelligence" has the disadvantage that laypeople attribute traits to the text generator that do not exist in reality. Undue trust is assigned to its output, verification is omitted, and the text generator is treated as an oracle or even as god-like.

Neutral Words

  • text generator
  • predictive text model
  • word probability calculator
  • word sequence prediction model
  • automatic word completion
  • language model with prediction functionality

Misattribution of Intelligence

ELIZA already existed in the 1960s: a simple chatbot, about 420 lines of source code, based on simple string matching.

ELIZA was a symbolic AI chatbot developed in 1966 by Joseph Weizenbaum that imitated a psychotherapist. Many early users were convinced of ELIZA's intelligence and understanding, despite its basic text-processing approach, the explanations of its limitations, and Weizenbaum's insistence to the contrary. (ELIZA effect; ELIZA)
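To give a sense of how little machinery is needed to produce that impression, here is a minimal sketch of ELIZA-style string matching. The patterns and canned responses below are invented for illustration and are not Weizenbaum's original DOCTOR script.

    # Minimal ELIZA-style chatbot sketch: simple pattern matching and
    # canned templates, with no understanding of the text at all.
    import re

    # A few invented rules in the spirit of the original script.
    RULES = [
        (re.compile(r"\bi am (.*)", re.IGNORECASE),
         "How long have you been {0}?"),
        (re.compile(r"\bi feel (.*)", re.IGNORECASE),
         "Why do you feel {0}?"),
        (re.compile(r"\bmy (.*)", re.IGNORECASE),
         "Tell me more about your {0}."),
    ]
    DEFAULT = "Please go on."

    def reply(user_input):
        for pattern, template in RULES:
            match = pattern.search(user_input)
            if match:
                # Echo the user's own words back inside a canned template.
                return template.format(match.group(1).rstrip(".!?"))
        return DEFAULT

    print(reply("I am worried about my exams"))
    # -> How long have you been worried about my exams?
    print(reply("My computer hates me"))
    # -> Tell me more about your computer hates me.
    print(reply("Hello"))
    # -> Please go on.

The second response shows the shallowness directly: the program echoes the user's words without even adjusting pronouns, yet conversations built from such rules were enough to convince users they were understood.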

Negative Effects of Artificial Intelligence

Model Collapse

When an AI is trained on ("reads") content produced by itself or by other AIs, it produces greater nonsense with each iteration. This is called model collapse.

Model collapse is a phenomenon in artificial intelligence (AI) where trained models, especially those relying on synthetic data or AI-generated data, degrade over time. (https://www.infobip.com/glossary/model-collapse)
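The effect can be illustrated numerically without any neural network. In the toy sketch below (a simple Gaussian example, not a claim about any specific AI system), each generation is fitted only to samples produced by the previous generation's model, so estimation noise compounds and the learned spread tends to shrink toward nothing over the generations.

    # Toy numerical illustration of model collapse: each generation is
    # "trained" only on samples from the previous generation's model,
    # so the learned distribution degenerates over time.
    import random
    import statistics

    random.seed(0)

    SAMPLES_PER_GENERATION = 10   # small samples make the effect visible quickly

    # Generation 0 is trained on "real" data: mean 0, standard deviation 1.
    data = [random.gauss(0.0, 1.0) for _ in range(SAMPLES_PER_GENERATION)]

    for generation in range(101):
        mu = statistics.fmean(data)        # "train" the model on current data
        sigma = statistics.pstdev(data)
        if generation % 10 == 0:
            print(f"generation {generation:3d}: mean={mu:+.3f} stdev={sigma:.3f}")
        # The next generation never sees real data, only synthetic output.
        data = [random.gauss(mu, sigma) for _ in range(SAMPLES_PER_GENERATION)]

Real model collapse in large language models is more complex, but the core mechanism is the same: information about the original data distribution is lost each time a model learns only from another model's output.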

Open Source Requirements

The following are the fundamental components required for an AI system to be classified as Open Source and Freedom Software, together with the significance of these classifications.

  • AI model source code: The source code should be freely accessible and published under a license approved by organizations such as the OSI or FSF, or compliant with the DFSG.
  • Training data: Ideally, the training data should also be available under an approved license. However, in some cases this is not possible, e.g. with personal data. If the training data is non-freedom, the project would fall into categories like "contrib", as is the case with Debian.
  • Build documentation steps: Clear instructions for compiling or training the model from source code must be provided so that third parties can reproduce the model.
  • Dependencies: All software libraries and packages required for the AI model should also be Open Source or Freedom Software. Dependencies that are non-freedom would also put the software in a "contrib" category.
  • Configuration files and scripts: Often, in addition to the items mentioned above, special configuration files or scripts are also required to successfully train or run the AI model. These should also be under an approved license.
  • License File: A clear license file that explains the terms and conditions for use, modification, and distribution of the software is essential.

Overall, free access to all resources required for the AI model is crucial for its classification as Open Source and Freedom Software. Without these components, the project could be considered partially free, but would not meet the full criteria for Open Source and Freedom Software.
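As an illustration only, a sketch like the following could check whether a model repository ships the components listed above. The file and directory names are hypothetical conventions, not any standard, and a real review would also have to read the licenses and verify the data provenance itself.

    # Hypothetical sketch: check whether an AI model repository contains the
    # artifacts discussed above. File names are illustrative conventions only.
    from pathlib import Path

    EXPECTED = {
        "model source code":           ["src/", "model/"],
        "training data or manifest":   ["data/", "DATA-MANIFEST"],
        "build/training instructions": ["BUILD.md", "Makefile"],
        "dependency list":             ["requirements.txt", "environment.yml"],
        "training/run configuration":  ["config/", "train.yaml"],
        "license file":                ["LICENSE", "COPYING"],
    }

    def audit(repo: Path) -> None:
        for component, candidates in EXPECTED.items():
            present = any((repo / c).exists() for c in candidates)
            status = "found" if present else "MISSING"
            print(f"{component:<28} {status}")

    audit(Path("."))  # run against the current directory as a demo

Such a check only confirms that the pieces exist; whether each piece is actually under a freedom-respecting license still requires human review.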

Open Source AI Definition - Lack of Consensus on the Open Source Initiative's Definition

The Open Source AI Definition – 1.0 by the Open Source Initiative lacks community consensus. It represents one perspective on what constitutes open-source AI, but there are differing views within the AI and open-source communities regarding the requirements, limitations, and implications of truly Open Source AI.

Security

Being Open Source is essential for avoiding backdoors.

Reproducible / Deterministic Builds

Reproducible or deterministic builds are a crucial aspect of Open Source and Freedom Software as they contribute to the transparency, trustworthiness, and verifiability of the software. A reproducible build ensures that given the same source code, build environment and build instructions, the binary output will always be identical. This is vital for verifying that the build is free from malicious alterations or unintended deviations from the source code.

  • Verification: By comparing the checksum of the build output from an independent build process with the checksum of the official release, any discrepancies can be identified, ensuring that the binary has been compiled correctly and hasn’t been tampered with.
  • Debugging: Deterministic builds make debugging easier as developers can work with exact copies of the software, ensuring consistency between testing and production environments.
  • Collaboration: When multiple developers or teams work on the same project, reproducible builds ensure that everyone is working with the exact same binary, reducing the likelihood of inconsistent behavior and bugs due to environment differences.
  • Compliance and Auditing: For projects that require adherence to certain regulatory or compliance standards, reproducible builds provide a clear audit trail of what code was compiled and how.
  • Long-Term Maintenance: In cases where a project needs to be maintained or updated over a long period, reproducible builds ensure that it’s always possible to recreate the exact original build environment, making future maintenance and debugging far simpler.

Reproducible builds are an essential practice in achieving the goals of Open Source and Freedom Software, contributing significantly to the integrity, transparency, and community collaboration inherent in these projects.
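A minimal sketch of the verification step described above follows. The file names are placeholders for an official release artifact and an independently rebuilt one; any project would use its own names and would typically also check signatures, not just checksums.

    # Minimal sketch: verify a reproducible build by comparing the SHA-256
    # checksum of an independently rebuilt artifact against the official one.
    import hashlib
    import sys

    def sha256sum(path):
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                digest.update(chunk)
        return digest.hexdigest()

    # Usage: python verify_build.py official-release.bin independent-rebuild.bin
    # (both file names are placeholders for whatever artifacts a project ships)
    official_path, rebuilt_path = sys.argv[1], sys.argv[2]
    official = sha256sum(official_path)
    rebuilt = sha256sum(rebuilt_path)
    print("official :", official)
    print("rebuilt  :", rebuilt)
    if official == rebuilt:
        print("OK: the independent rebuild is bit-for-bit identical to the release.")
    else:
        print("MISMATCH: the build is not reproducible or an artifact was altered.")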

Misuse of the Term Open Source in the Context of Some AI Projects

The misuse of the term "Open Source" by certain parts of the AI community is concerning, as it can lead to misunderstandings regarding the actual licensing and accessibility of the AI projects in question. The word "Real" is sometimes prefixed to "Open Source" to underscore this issue.

Instances such as Meta (formerly Facebook) releasing a large AI language model that was heralded as an open-source AI project by certain publications exhibit this misuse. Such articles misrepresent the true nature of the release, as Meta's AI is not genuinely Open Source. Anyone attempting to download the Meta AI is confronted with a proprietary license agreement, indicating that the AI does not conform to the open source ethos of freedom and accessibility.

This misrepresentation could potentially mislead individuals and organizations interested in utilizing or contributing to Open Source AI projects. It's essential for the community to adhere to the accurate usage of the term "Open Source", ensuring that it remains synonymous with the principles of free, accessible, and transparent software development.

The term Open Source has been established for decades, embodying a set of values centered around transparency, collaboration, and freedom in software development.

Other Issues

Freeware Self-Hosted Artificial Intelligence

Freeware, self-hosted AI is not real Open Source. It is only freeware that happens to run locally.

For system security it is strongly advised not to install proprietary, non-freedom software. Instead, use of Free Software is recommended.

Possible risks associated with using non-freedom software:

  • Potential advanced backdoors or malware in the software itself.
  • Privacy breaches, possibly including a keylogger.
  • Software that depends on third-party servers could access identifying information, such as payment or login data linked to a real identity.

For more information on installing third-party free software, consult the Foreign Sources page for advice. See also: Is It Ever a Good Thing to Use a Nonfree Program?

Open Source software like Qubes, Debian and Kicksecure is more secure than proprietary/closed source software. The public scrutiny of security by design has proven to be superior to security through obscurity. This aligns the software development process with Kerckhoffs' principle - the basis of modern cipher-systems design. This principle asserts that systems must be secure, even if the adversary knows everything about how they work. Generally speaking, Freedom Software projects are much more open and respectful of the privacy rights of users. Freedom Software projects also encourage security bug reports, open discussion, public fixes and review.

As Free Software pioneer Richard Stallman puts it:

  • "... If you run a nonfree program on your computer, it denies your freedom; the main one harmed is you. ..."
  • "Every nonfree program has a lord, a master -- and if you use the program, he is your master.“
  • "To have the choice between proprietary software packages, is being able to choose your master. Freedom means not having a master. And in the area of computing, freedom means not using proprietary software."

Or as the GNU project puts it:

  • Proprietary Software Is Often Malware

  • Nonfree (proprietary) software is very often malware (designed to mistreat the user). Nonfree software is controlled by its developers, which puts them in a position of power over the users; that is the basic injustice. The developers and manufacturers often exercise that power to the detriment of the users they ought to serve.

  • This typically takes the form of malicious functionalities.

  • Some malicious functionalities are mediated by back doors.

  • Back door: any feature of a program that enables someone who is not supposed to be in control of the computer where it is installed to send it commands. (Editor's note: usually without the user's consent or awareness.)

The GNU project created a list with examples of Proprietary Back Doors. The Electronic Frontier Foundation (EFF) has further examples of the use of back doors.

Related: Why Kicksecure is Freedom Software


Complete hardware + software setup for running Deepseek-R1 locally. The actual model, no distillations, and Q8 quantization for full quality. Total cost, $6,000. All download and part links below: https://x.com/carrigmat/status/1884244369907278106

Resources

Tickets

See Also

Search term: DFSG compliant AI
