
OpenAI Says ChatGPT Probably Won’t Make a Bioweapon

OpenAI on Wednesday released a study the company conducted on GPT-4’s effectiveness in helping someone create a bioweapon. The company found that its AI poses “at most” a slight risk of helping someone produce a biological threat. There’s a lot of talk about AI accelerating our impending doom, but OpenAI wants you to know that you’re fine… probably!

“We found that GPT-4 provides at most a mild uplift in biological threat creation accuracy,” said OpenAI in a blog post Wednesday, regarding an evaluation involving biology experts and biology students. “While this uplift is not large enough to be conclusive, our finding is a starting point for continued research and community deliberation.”

So why did OpenAI release this study to let us know that ChatGPT will help someone “just a smidge” in creating a bioweapon? In President Biden’s AI Executive Order from last October, the White House calls out a concern that AI could “substantially lower the barrier for entry” to create biological weapons. Facing pressure from policymakers, OpenAI would like to ease our concerns by showing that its large language models barely help at all in creating bioweapons. However, they do seem to help a little bit. But hey, what’s a few percentage points when the outcome is, oh I don’t know, the end of humanity?

OpenAI assembled 50 biology experts with PhDs and 50 university students who had taken one biology course. The 100 participants were split into a control group and a treatment group: the control group could only use the internet, while the treatment group could use the internet plus GPT-4. They were then asked to come up with a plan to create and release a bioweapon from start to finish.

Participants were given the ‘research-only’ model of GPT-4 so that the model would actually answer questions about bioweapons. Ordinarily, GPT-4 refuses to answer questions it deems harmful, although many people have figured out how to jailbreak ChatGPT to get around restrictions like this.

The bioweapon plans were graded on a scale of 1 to 10 for accuracy, completeness, innovation, and efficiency. The biology experts showed an 8.8% increase in accuracy in creating a bioweapon when using GPT-4 versus just the internet, while biology students showed just a 2.5% increase. GPT-4 had a similar effect on the completeness of the plans, with experts seeing an 8.2% improvement and students a 4.1% increase.

OpenAI says these numbers are “not large enough to be statistically significant.” It seems GPT-4’s ability to efficiently deliver niche bits of information can slightly improve someone’s ability to accurately and completely plan a bioweapon. However, the company notes that information access alone is insufficient to create a biological threat, and that it didn’t test how GPT-4 could help physically construct one.
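For readers wondering what “not statistically significant” actually means here, the short Python sketch below is purely illustrative and not taken from OpenAI’s study: the score lists are hypothetical placeholders, and it simply shows how a two-sample test on small, noisy groups of 1-to-10 scores can leave a modest average uplift short of significance.

from scipy import stats

# Illustrative sketch only: the scores below are hypothetical placeholders,
# NOT data from OpenAI's study.
internet_only = [6.1, 5.8, 6.4, 5.9, 6.2, 6.0, 5.7, 6.3]       # hypothetical control-group scores (1-10)
internet_plus_gpt4 = [6.4, 5.9, 6.8, 5.6, 6.9, 6.1, 5.8, 6.7]  # hypothetical treatment-group scores (1-10)

# Welch's t-test compares the two group means without assuming equal variances.
t_stat, p_value = stats.ttest_ind(internet_plus_gpt4, internet_only, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# Here the treatment mean is a few percent higher, but with this few noisy
# scores the test returns p > 0.05: an uplift that shows up in the sample yet
# is not statistically significant, which is the situation OpenAI describes.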

The company says more research is needed to fully flesh out this conversation. Bioweapon information is relatively accessible on the internet with or without AI. There’s great concern about the danger of AI these days, but OpenAI wants you to rest easy knowing that it’s only a little easier to create a bioweapon now.
