- Nov 2024
-
www.youtube.com
-
when this technology meets it, that our interiors are not completely taken over, because this technology is so potent. It would be very easy to lose our souls, to be so conditioned so quickly by the dopamine, by whatever is going to happen when this stuff rolls out
Very important. This is why we are meeting AI as it evolves. We are training it in our language and with our QUALIA
-
Just going back to the AI: to the extent that the fourth turning meets the people who are actually doing the AI, and informs the AI that actually the wheel goes this way, don't listen to those guys, it goes this way.
for - AI - the necessity of training AI with human development - John Churchill
-
- Aug 2024
-
www.youtube.com
-
For example, our standard English language model is trained with something like maybe 100 gigabytes or so of text. That gives it a strength as if you would throw BERT at it with the Google corpus. The other thing is, of course, a small corpus like that is computed in two or three hours on a laptop. And by the way, I didn't mention that our fingerprints are actually Boolean, so when we train, as I said, we are not using floating points.
for - comparison - cortical io vs normal AI - training dataset size and time
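The Boolean-fingerprint idea can be sketched in a few lines: cortical.io-style "semantic fingerprints" are sparse binary vectors, and similarity is measured by bit overlap rather than floating-point dot products. A minimal illustration, with made-up toy fingerprints (the indices and 16k-bit space below are assumptions, not cortical.io's actual data):

```python
# Illustrative sketch only: semantic fingerprints as sets of active bit
# indices in a (hypothetical) 16k-bit space; similarity = bit overlap.
def overlap_similarity(fp_a: set[int], fp_b: set[int]) -> float:
    """Fraction of shared active bits relative to the smaller fingerprint."""
    if not fp_a or not fp_b:
        return 0.0
    return len(fp_a & fp_b) / min(len(fp_a), len(fp_b))

# Toy fingerprints (made-up indices of active bits).
fp_dog = {12, 87, 301, 955, 4021, 9000}
fp_wolf = {12, 87, 301, 1200, 4021, 8500}
fp_car = {55, 600, 2048, 3000, 7777, 15000}

print(overlap_similarity(fp_dog, fp_wolf))  # high overlap: 4 of 6 bits shared
print(overlap_similarity(fp_dog, fp_car))   # no overlap -> 0.0
```

Because the representation is Boolean, training and comparison reduce to set operations, which is one reason such a model can be trained on a laptop in hours.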
-
- Jun 2024
-
-
Suppose that GPT-4 training took 3 months. In 2027, a leading AI lab will be able to train a GPT-4-level model in a minute.
for - stat - AI evolution - prediction 2027 - training time - ~5 OOM decrease
stat - AI evolution - prediction 2027 - training time - ~5 OOM decrease - today it takes 3 months to train GPT-4 - in 2027, it will take 1 minute - That is, 131,400 minutes vs 1 minute, or ~5 OOM
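The arithmetic behind the tag can be checked directly, taking "3 months" as a quarter of a 365-day year:

```python
# Back-of-the-envelope check of the training-time speedup.
import math

quarter_in_minutes = 365 / 4 * 24 * 60   # = 131,400 minutes
speedup = quarter_in_minutes / 1         # vs. a 1-minute training run
ooms = math.log10(speedup)               # orders of magnitude

print(f"{quarter_in_minutes:,.0f} minutes -> ~{ooms:.1f} OOM")
# -> 131,400 minutes -> ~5.1 OOM
```

A 131,400x speedup is about 10^5.1, i.e. roughly five orders of magnitude, not six.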
-
- May 2024
-
meta.stackexchange.com
-
I feel violated, cheated upon, betrayed, and exploited.
-
What could possibly go wrong? Dear Stack Overflow denizens, thanks for helping train OpenAI's billion-dollar LLMs. Seems that many have been drinking the AI koolaid or mixing psychedelics into their happy tea. So much for being part of a "community", seems that was just happy talk for "being exploited to generate LLM training data..." The corrupting influence of the profit-motive is never far away.
-
If you ask ChatGPT to cite, it will provide random citations. That's different from actually training a model to cite (e.g., using supervised finetuning on citations, with human raters checking whether sources match, which would also let you verify how accurately a model cites). This is something OpenAI could do; it just doesn't.
-
-
openai.com
-
We train our models using:
-
- Sep 2023
-
www.semanticscholar.org
-
For a socially and economically sustainable growth path, the labor displacement in the sectors of application must be counterbalanced by job creation within the same and other sectors.
It's 2023 and I don't see anyone planning for this massive job displacement. I think the Hollywood strikes are a sign of things to come.
-
- May 2023
-
ourworldindata.org
-
A book is defined as a published title with more than 49 pages.
[24] AI - Bias in Training Materials
-
- Jun 2018
-
cognitiveclass.ai
-
Nice site sponsored by IBM providing a lot of training materials for AI, Machine Learning, and programming
-
-
spark.apache.org
-
Collaborative Filtering sample with Apache Spark
This framework can be used for recommender systems.
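The idea behind a collaborative-filtering recommender can be sketched without Spark; Spark's own implementation (ALS matrix factorization in `spark.ml`) scales this kind of computation to clusters. A minimal user-based sketch in plain Python, with made-up toy ratings:

```python
# Minimal user-based collaborative filtering sketch (toy data, not Spark).
from math import sqrt

ratings = {  # user -> {item: rating}
    "alice": {"film_a": 5.0, "film_b": 3.0, "film_c": 4.0},
    "bob":   {"film_a": 4.0, "film_b": 2.0, "film_c": 5.0},
    "carol": {"film_a": 1.0, "film_b": 5.0},
}

def cosine(u: dict, v: dict) -> float:
    """Cosine similarity over the items two users rated in common."""
    common = u.keys() & v.keys()
    if not common:
        return 0.0
    dot = sum(u[i] * v[i] for i in common)
    norm_u = sqrt(sum(u[i] ** 2 for i in common))
    norm_v = sqrt(sum(v[i] ** 2 for i in common))
    return dot / (norm_u * norm_v)

def recommend(user: str, k: int = 1) -> list[str]:
    """Score unseen items by similarity-weighted ratings of other users."""
    seen = ratings[user].keys()
    scores: dict[str, float] = {}
    weights: dict[str, float] = {}
    for other, other_ratings in ratings.items():
        if other == user:
            continue
        sim = cosine(ratings[user], other_ratings)
        for item, r in other_ratings.items():
            if item in seen or sim <= 0:
                continue
            scores[item] = scores.get(item, 0.0) + sim * r
            weights[item] = weights.get(item, 0.0) + sim
    ranked = sorted(scores, key=lambda i: scores[i] / weights[i], reverse=True)
    return ranked[:k]

print(recommend("carol"))  # -> ['film_c']
```

In Spark the same task would use `ALS.fit` on a ratings DataFrame; the sketch above just shows the underlying idea of predicting unseen ratings from similar users.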
-