Here’s why people are saying GPT-4 is getting lazy | Digital Trends

OpenAI and its technologies have been in the midst of scandal for most of November. Between the swift firing and rehiring of CEO Sam Altman and the curious case of the halted ChatGPT Plus paid subscriptions, OpenAI has kept the artificial intelligence industry in the news for weeks.

Now, AI enthusiasts have rehashed an issue that has many wondering whether GPT-4 is getting “lazier” as the language model continues to be trained. Many who use it to speed up more intensive tasks have taken to X (formerly Twitter) to air their grievances about the perceived changes.

OpenAI has safety-ed GPT-4 sufficiently that its become lazy and incompetent.

Convert this file? Too long. Write a table? Here's the first three lines. Read this link? Sorry can't. Read this py file? Oops not allowed.

So frustrating.

— rohit (@krishnanrohit) November 28, 2023

Rohit Krishnan on X detailed several of the mishaps he experienced while using GPT-4, the language model behind ChatGPT Plus, the paid version of ChatGPT. He explained that the chatbot has refused several of his queries or given him truncated responses to requests that previously returned detailed answers. He also noted that the language model will use tools other than the ones it has been instructed to use, such as DALL-E when a prompt asks for the code interpreter. Krishnan also sarcastically added that “error analyzing” is the language model’s way of saying “AFK [away from keyboard], be back in a couple of hours.”

Matt Wensing on X detailed his experiment, in which he asked ChatGPT Plus to make a list of dates between now and May 5, 2024; the chatbot asked for additional information, such as the number of weeks between those dates, before it would complete the initial task.
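For context on why users found that exchange frustrating, the task itself is trivially deterministic. Below is a minimal Python sketch, assuming the request was simply a week-by-week list of dates (Wensing’s exact prompt isn’t reproduced here, so the weekly granularity is an assumption), showing that no clarifying question is needed:

```python
from datetime import date, timedelta

# Assumed reconstruction of the task: list each week's start date from
# today through May 5, 2024. The weekly granularity is an assumption
# for illustration; the original prompt isn't shown above.
start = date.today()
end = date(2024, 5, 5)

weeks = []
current = start
while current <= end:
    weeks.append(current)
    current += timedelta(weeks=1)

print(f"{len(weeks)} weeks between {start} and {end}:")
for d in weeks:
    print(d.isoformat())
```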

Wharton professor Ethan Mollick also shared his observations of GPT-4 after rerunning a sequence of analyses he had done with the code interpreter in July and comparing them to more recent queries from Tuesday. He concluded that GPT-4 still knows what to do, but noted that it explained to him how to fix his code rather than actually fixing it. In essence, he would have to do the work he was asking GPT-4 to do. Though Mollick did not intend to critique the language model, his observations fall in step with what others have described as “back talk” from GPT-4.

ChatGPT is known to hallucinate answers for information that it does not know, but these errors appear to go far beyond the common missteps of the AI chatbot. GPT-4 was introduced in March, but as early as July, reports of the language model getting “dumber” began to surface. A study by researchers at Stanford University and the University of California, Berkeley observed that GPT-4’s accuracy on one type of math question (identifying whether a number is prime) dropped from 97.6% to 2.4% between March and June alone. It detailed that the paid version of ChatGPT was unable to provide the correct answer with a detailed explanation, while the unpaid version, which still runs the older GPT-3.5 model, gave the correct answer and a detailed explanation of the mathematical process.

During that time, Peter Welinder, OpenAI’s vice president of product, suggested that heavy users might be experiencing a psychological phenomenon in which the quality of answers appears to degrade over time even though the language model is actually becoming more efficient.

There has been discussion if GPT-4 has become "lazy" recently. My anecdotal testing suggests it may be true.

I repeated a sequence of old analyses I did with Code Interpreter. GPT-4 still knows what to do, but keeps telling me to do the work. One step is now many & some are odd. pic.twitter.com/OhGAMtd3Zq

— Ethan Mollick (@emollick) November 28, 2023

According to Mollick, the current issues might similarly be temporary and due to a system overload or a change in prompt style that hasn’t been made apparent to users. Notably, OpenAI cited a system overload as a reason for the ChatGPT Plus sign-up shutdown following the spike in interest in the service after its inaugural DevDay developers’ conference introduced a host of new functions for the paid version of the AI chatbot. There is still a waitlist in place for ChatGPT Plus. The professor also added that ChatGPT on mobile uses a different prompt style, which results in “shorter and more to-the-point answers.”

Yacine on X said that the latest GPT-4 model has become unreliable due to a drop in instruction adherence, which has pushed them back to traditional coding; they added that they plan to build a local code LLM to regain control over the model’s parameters. Other users have mentioned opting for open-source options amid the language model’s decline.

Similarly, Reddit user Mindless-Ad8595 explained that more recent updates to GPT-4 have made it too smart for its own good. “It doesn’t come with a predefined ‘path’ that guides its behavior, making it incredibly versatile, but also somewhat directionless by default,” he said.

The programmer recommends that users create custom GPTs specialized by task or application to make the model’s output more efficient; beyond that, he doesn’t offer any practical solutions for users who want to stay within OpenAI’s ecosystem.

App developer Nick Dobos shared his experience with GPT-4 mishaps, noting that when he prompted ChatGPT to write Pong in SwiftUI, he discovered various placeholders and to-dos within the code. He added that the chatbot kept inserting these placeholders and to-dos even when instructed to do otherwise. Several X users shared similar experiences with their own examples of code featuring placeholders and to-dos. Dobos’ post got the attention of an OpenAI employee who said they would forward the examples to the company’s development team for a fix, promising to share any updates in the meantime.
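Dobos’ example was SwiftUI, but the pattern he and other users described is language-agnostic: the model returns a scaffold with the hard parts stubbed out. A hypothetical Python sketch of that kind of response might look like the following; none of these function names come from Dobos’ actual code, they simply illustrate the placeholder-and-to-do style being complained about.

```python
# Hypothetical illustration of the placeholder-heavy output users described.
# Nothing here is taken from Dobos' SwiftUI code; the stubs just show the
# "you do the rest" pattern.
def move_ball(ball_position, ball_velocity):
    # TODO: implement ball movement and wall bouncing
    pass

def check_paddle_collision(ball_position, paddle_position):
    # Placeholder: add collision detection here
    pass

def game_loop():
    # ... set up the paddles, score, and rendering, then do the rest ...
    pass
```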

Overall, there is no clear explanation as to why GPT-4 is currently experiencing these complications. Users discussing their experiences online have floated many theories, ranging from OpenAI merging models, to continued server overload from running both GPT-4 and GPT-4 Turbo, to the company attempting to save money by limiting results.

It is well-known that OpenAI runs an extremely expensive operation. In April 2023, researchers indicated it took $700,000 per day, or 36 cents per query, to keep ChatGPT running. Industry analysts estimated at the time that OpenAI would have to acquire an additional 30,000 GPUs to maintain its commercial performance for the remainder of the year, covering ChatGPT’s workloads as well as the computing needs of all of its partners.

While waiting for GPT-4 performance to stabilize, users exchanged several quips, making light of the situation on X.

“The next thing you know it will be calling in sick,” Southrye said.

“So many responses with ‘and you do the rest.’ No YOU do the rest,” MrGarnett said.

The number of replies and posts about the problem is definitely hard to ignore. We’ll have to wait and see if OpenAI can tackle the problem head-on in a future update.

Here’s why you can’t sign up for ChatGPT Plus right now | Digital Trends

CEO Sam Altman’s sudden departure from OpenAI over the weekend isn’t the only drama happening with ChatGPT. Due to high demand, paid subscriptions for OpenAI’s ChatGPT Plus have been halted for nearly a week.

The company has a waitlist so that those interested in subscribing can be notified when the AI chatbot’s paid tier is available once more.

Interest in ChatGPT Plus spiked following OpenAI’s inaugural DevDay developers’ conference, which took place earlier this month and unveiled a host of new functions for the paid version of the AI chatbot. Some of these features include being able to create custom bots with the GPT-4 language model that can be trained on specialized data to perform specific functions. Some of the custom GPTs include a model for Canva, a therapist model called TherapistGPT, and a Tweet enhancer for X. More general models include book creators, SEO assistants, photo critics, QR code generators, and birthday cake designers, according to ZDNet.

we are pausing new ChatGPT Plus sign-ups for a bit 🙁

the surge in usage post devday has exceeded our capacity and we want to make sure everyone has a great experience.

you can still sign-up to be notified within the app when subs reopen.

— Sam Altman (@sama) November 15, 2023

OpenAI also shared more details on GPT-4 Turbo, a supercharged version of the language model that can process a 128K-token context window, far larger than the 8K and 32K windows of standard GPT-4. Other functions enable web browsing, multimodal GPT-4 access, DALL-E 3 image generation, and advanced data analysis without having to switch between models.

Excitement over the new functions coming to ChatGPT Plus sent users rushing to sign up for the service, which costs $20 per month. CEO Sam Altman then shared on X (formerly Twitter) that the company was reinstating the waitlist for ChatGPT Plus subscriptions because post-DevDay sign-ups had exceeded the service’s capacity.

This appears to mirror the early days of ChatGPT, when the chatbot experienced capacity issues that caused random downtime, which is what prompted the company to establish a paid subscription tier in the first place. In April 2023, researchers indicated it took $700,000 per day, or 36 cents per query, to keep ChatGPT running. Paid accounts, along with various enterprise deals, have helped keep the chatbot running without major incident for some time. ChatGPT notably supports 100 million weekly users and over 2 million developers on its platform, helping it outpace competitor organizations such as Meta (formerly Facebook).

The service experienced an outage in early November following its DevDay conference, which left ChatGPT and its API inaccessible to free and paid users and developers for over 90 minutes. OpenAI stated that the excess traffic that caused the crash was due to a DDoS attack and not an inability to support users.

November 14 brought several changes to ChatGPT. In addition to the waitlist for ChatGPT Plus, which is still in place as of Monday, those who have not logged in recently will be greeted with a notification of updates to the chatbot’s terms of service and privacy policy. Some highlights of the terms of service include clarifications on registration and access, information on how the service can be used, and details about content. Similarly, the updated privacy policy spells out in greater detail the information the company might collect, how it’s used, and what your rights are when using the service.

There is also a new Tips for Getting Started notice for those who haven’t logged in for a while, indicating that you should not input private information into ChatGPT and that you should double-check its responses for inaccuracies.

Subscriptions for ChatGPT Plus has paused by OpenAI.

Overnight an underground market for subscriptions for sale on eBay has exploded.

Some folks paying up to 3 times the cost from direct.

We live in interesting times. pic.twitter.com/j4YqRbpNfS

— Brian Roemmele (@BrianRoemmele) November 15, 2023

Amid the ChatGPT Plus registration pause, users have been discovered reselling paid accounts on eBay for a premium. While a ChatGPT Plus subscription directly from OpenAI costs just $20, resellers are offering access to the service for two to three times that amount. Mashable noted that, while not wholly illegal, such actions are often a breach of many businesses’ terms of service.

OpenAI does spell out in its recently updated terms of use that users registering for ChatGPT must “provide accurate and complete information,” and that anyone registering an account on behalf of another person or entity must have the authority to accept the terms on their behalf. A breach of the terms of use can result in the suspension or termination of an account.

Users who gain access to a resold account advertised as lasting “one year” might also find that OpenAI ends the account short of that time. Mashable added that joining the waitlist remains the best way to gain access to ChatGPT Plus.

Currently, there is no telling how the leadership shake-up at OpenAI will affect ChatGPT Plus becoming available to users once more. The free version of ChatGPT has remained functional throughout this ordeal.

ChatGPT’s iPhone app now has Bing built-in | Digital Trends

ChatGPT for iOS now offers a connection to Bing for a user experience that incorporates more up-to-date information.

There is a caveat, however, as only paid subscribers to the app’s premium Plus tier are able to take advantage of the new feature.

It means that the AI-powered ChatGPT app will now be able to pull up more recent information instead of relying solely on the web-based data it was trained on, which only extends to 2021.

“Plus users can now use ‘Browsing’ to get comprehensive answers and current insights on events and information that extend beyond the model’s original training data,” OpenAI said in notes for the latest version of ChatGPT for iPhone, released on Tuesday.

Users of the paid Plus tier can try it out by enabling Browsing in the “New Features” section of the app’s settings. Next, select GPT-4 in the model switcher and choose “Browse with Bing” in the drop-down.

While it’s a disappointment that the update allows engagement only with Bing and not other search engines, it’s no surprise that this is the search tool ChatGPT has incorporated. That’s because Bing is made by Microsoft, the company that earlier this year announced a major investment in OpenAI. The funding, which follows two similar rounds of support from Microsoft in 2019 and 2021, was reportedly worth as much as $10 billion. ChatGPT is also already part of Bing web search and is integrated into Microsoft’s Edge browser.

OpenAI released the free ChatGPT app for iPhone last month. ChatGPT Plus costs $20 per month with benefits including smooth access to ChatGPT during peak times, faster response times, and priority access to new features and improvements.

Only just heard about the ChatGPT chatbot? It’s not too late to start learning about it, as it won’t be going away anytime soon. Digital Trends has a useful article explaining what it’s all about, and how you can try it out for yourself.

You might also want to try Google’s AI-powered chatbot, called Bard. But first, find out how it shapes up against ChatGPT.

These 2 new ChatGPT features will change everything | Digital Trends

ChatGPT Plus subscribers will soon get a much more powerful version of OpenAI’s large language model, allowing it to access plugins and the internet. This will dramatically expand the capabilities and usefulness of the world’s most famous chatbot.

OpenAI shared the news via a tweet linking to the latest ChatGPT release notes. Web browsing and plugin access will begin as an optional beta feature that can be enabled in settings.

The beta panel started rolling out along with the new ChatGPT features last Friday. If you’re a ChatGPT Plus subscriber but don’t see a beta panel in settings, you won’t be able to access browsing or plugins yet.

A few early-access users got to test ChatGPT plugins in a limited form in late March. Since then, the number of available plugins has grown dramatically from a meager 11 to over 70.

Plugins provide access to various websites and services, allowing ChatGPT to perform more complicated tasks and access specialized databases. Examples include checking the weather, booking a flight, finding deals, and looking up recipes. The web browsing feature allows ChatGPT to bring in current information, similar to how Bing Chat and Google Bard work.

YouTube channel App Of The Day gives a good overview of what ChatGPT plugins look like and how they work.

ChatGPT Plugins and Web Browsing – Everything You Need to Know

After you’ve enabled plugins, ChatGPT knows when to use them, so you only need to specifically name a service if you have more than one that can help with the same task. The real power comes in combining multiple plugins, with ChatGPT orchestrating everything.
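Under the hood, each plugin advertises itself to ChatGPT through a small manifest (the ai-plugin.json file) plus an OpenAPI description of its endpoints, and the model reads the natural-language descriptions to decide when a plugin is relevant. Below is a minimal sketch of such a manifest for an imaginary weather plugin; every name and URL is a placeholder, not a real service.

```python
import json

# Sketch of an ai-plugin.json manifest for a hypothetical weather plugin.
# All names, URLs, and emails are placeholders for illustration only.
manifest = {
    "schema_version": "v1",
    "name_for_human": "Weather Helper",
    "name_for_model": "weather_helper",
    "description_for_human": "Look up current weather and short-term forecasts.",
    "description_for_model": (
        "Use this plugin when the user asks about current weather "
        "conditions or forecasts for a specific location."
    ),
    "auth": {"type": "none"},
    "api": {"type": "openapi", "url": "https://example.com/openapi.yaml"},
    "logo_url": "https://example.com/logo.png",
    "contact_email": "support@example.com",
    "legal_info_url": "https://example.com/legal",
}

print(json.dumps(manifest, indent=2))
```

The description_for_model field is what lets ChatGPT “know when to use” a plugin without the user naming it, as described above.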

Plugins let ChatGPT do more than type messages; it can perform tasks, a bit like AutoGPT but without the hassle. ChatGPT will also be able to pull in information from across the web and access specialized knowledge bases that aren’t indexed or searchable without visiting a particular website.

Plugins and internet access will change everything and greatly expand how people use ChatGPT.
