According to the post, the woman works in marketing and customer surveys. She initially used the AI tool to draft survey questions, but later began depending on it for analysis as well: she uploaded Excel files, downloaded the AI-generated output, and presented the results as official work.
The wrong calculation
The trouble began when she submitted a PowerPoint presentation to a client that included results computed using "Pearson's Correlation Coefficient". The Reddit user pointed out the mismatch: the survey consisted of text-based responses sorted into five "feeling" buckets, categorical data that is not suited to such a statistical method.
The attempted cover-up
After this incident, the user turned to Reddit and asked, "So, what can we do to save her job?" One commenter suggested a cover story for damage control: "Say that you were using placeholder data and accidentally included it in the version sent to the client." Another was blunt: "Accept the truth, face the consequences, and learn the lesson that 'ChatGPT can make mistakes. Check important info.'"
This story exposes the risks of over-reliance on AI tools for professional work that demands verifiable accuracy. While ChatGPT can help with brainstorming and drafting, experts frequently warn that it can produce mistakes, misleading methods, or entirely fabricated results if left unchecked.
Disclaimer: This story is based on a post shared by a Reddit user. The details, opinions, and statements provided here are solely those of the original poster and do not reflect the views of News18 Hindi. We have not independently verified the claims.