{"id":1988,"date":"2025-07-18T13:49:04","date_gmt":"2025-07-18T13:49:04","guid":{"rendered":"https:\/\/myanmarmuslim.news\/en\/?p=1988"},"modified":"2025-07-18T13:49:05","modified_gmt":"2025-07-18T13:49:05","slug":"why-seeing-isnt-believing-anymore-deepfakes-and-ai-generated-scams","status":"publish","type":"post","link":"https:\/\/myanmarmuslim.news\/en\/2025\/07\/18\/why-seeing-isnt-believing-anymore-deepfakes-and-ai-generated-scams\/","title":{"rendered":"Why seeing isn\u2019t believing anymore: Deepfakes, and AI-generated scams"},"content":{"rendered":"\n<p>KUALA LUMPUR, July 18&nbsp;\u2014 In today\u2019s digital world, seeing is no longer believing.<\/p>\n\n\n\n<p>With Artificial Intelligence (AI) becoming increasingly sophisticated, fake videos, audio clips and images that look and sound eerily real, known as deepfakes, are emerging as one of the biggest threats to truth and trust online.<\/p>\n\n\n\n<p>According to David Chak, co-founder and director of Arus Academy, which runs media literacy education programmes across Malaysia, deepfakes are no longer just a futuristic fear.<\/p>\n\n\n\n<p><strong>What is a deepfake?<\/strong><\/p>\n\n\n\n<p>Chak explained that deepfakes are highly realistic videos, images or audio recordings created using AI, specifically machine learning.<\/p>\n\n\n\n<p>He said these tools are trained to imitate a person\u2019s appearance, voice or mannerisms using existing footage or recordings so convincingly that they can trick even the most tech-savvy viewer.<\/p>\n\n\n\n<p>\u201cFor example, with enough audio of Prime Minister Datuk Seri Anwar Ibrahim available online, AI can generate a deepfake video of him saying something completely fabricated, whether that\u2019s a political statement or something as absurd as endorsing Oreo biscuits.<\/p>\n\n\n\n<p>\u201cThese tools can be used for creative or entertainment purposes, but in the wrong hands, they are powerful tools of deception,\u201d he told&nbsp;<em>Malay 
Mail<\/em>.<\/p>\n\n\n\n<p><strong>Main types of AI-generated content<\/strong><\/p>\n\n\n\n<p>Chak highlighted three main categories of AI-generated fake content:<\/p>\n\n\n\n<p><strong>1. Fake visuals (deepfake videos)<\/strong><\/p>\n\n\n\n<p>Videos of someone appearing to do or say something they never did. One disturbing example is deepfake pornography, where someone\u2019s face is placed onto explicit content without their consent.<\/p>\n\n\n\n<p><strong>2. Fake audio<\/strong><\/p>\n\n\n\n<p>AI can mimic someone\u2019s voice based on publicly available recordings. It is now possible to generate phone calls or voice notes that sound exactly like a politician, celebrity or even a loved one.<\/p>\n\n\n\n<p><strong>3. Combined audio and visual<\/strong><\/p>\n\n\n\n<p>When visuals and audio are merged into a single synthetic video, the result can be indistinguishable from real footage.<\/p>\n\n\n\n<p>These are often disguised as breaking news or public announcements to manipulate emotions and spread misinformation.<\/p>\n\n\n\n<p><strong>How to detect AI-generated content<\/strong><\/p>\n\n\n\n<p>Also weighing in on the same matter, Dr Afnizanfaizal Abdullah, an AI researcher with the Malaysian Research Accelerator for Technology and Innovation (MRANTI), said there is a range of techniques to identify content that has been manipulated using AI.<\/p>\n\n\n\n<p><strong>1. Unnatural blinking and facial movement<\/strong><\/p>\n\n\n\n<p>One common giveaway is how a person blinks.<\/p>\n\n\n\n<p>In real life, people blink around 15 to 20 times per minute with slight variation, but deepfakes may show unnatural blinking patterns or none at all.<\/p>\n\n\n\n<p>Changes in facial features from one video frame to the next may not align with natural human movement, making the footage appear subtly off.<\/p>\n\n\n\n<p><strong>2. 
Facial asymmetry and visual inconsistencies<\/strong><\/p>\n\n\n\n<p>Minor imbalances in facial symmetry, especially around the eyes and mouth, can indicate manipulation.<\/p>\n\n\n\n<p>Deepfake videos often contain visual flaws, such as noticeable differences in image quality between the face and background, caused by unusual compression.<\/p>\n\n\n\n<p>Edges around the altered parts of the face may look poorly blended or unnatural.<\/p>\n\n\n\n<p><strong>3. Lighting and shadow mismatches<\/strong><\/p>\n\n\n\n<p>Lighting inconsistencies, such as mismatched shadows, highlights or reflections, can make the video appear unrealistic.<\/p>\n\n\n\n<p><strong>4. Frequency and noise anomalies<\/strong><\/p>\n\n\n\n<p>Deepfakes can leave behind unusual frequency patterns in both the audio and video signals.<\/p>\n\n\n\n<p>Artificial clips often have different background noise or grain compared to authentic recordings, which can be detected through technical analysis.<\/p>\n\n\n\n<p><strong>Existing detection tools struggling to keep up<\/strong><\/p>\n\n\n\n<p>Although detection tools are improving, Afnizanfaizal said they are still struggling to keep pace with increasingly sophisticated AI.<\/p>\n\n\n\n<p>He cited research showing that detection accuracy can drop from 90 per cent to below 60 per cent after a video is forwarded or reshared several times.<\/p>\n\n\n\n<p>He then explained that commercial tools such as Sentinel, Reality Defender and DuckDuckGoose AI offer detection services using algorithms that analyse facial landmarks, motion consistency and spectral patterns.<\/p>\n\n\n\n<p>However, these are most effective when analysing original, high-resolution content.<\/p>\n\n\n\n<p>\u201cAfter three or four compression cycles, the digital fingerprints that help us detect deepfakes are often lost. 
That makes platforms like TikTok and WhatsApp especially challenging environments for verification,\u201d he added.<\/p>\n\n\n\n<p>He also warned that synthetic audio and voice-cloning technologies are increasingly being exploited in Malaysian fraud cases, with the sophistication of these threats rising at an alarming pace.<\/p>\n\n\n\n<p>From a technical standpoint, the barriers to voice cloning have crumbled: modern AI voice synthesis can generate convincing clones using as little as 30 seconds of recorded speech.<\/p>\n\n\n\n<p>\u201cThis accessibility has democratised voice cloning for criminal purposes, shifting it from the exclusive domain of state-level actors to tools now easily available to petty criminals,\u201d he said.<\/p>\n\n\n\n<p>He added that criminal syndicates now use automated systems to extract voice samples from social media, video calls and phone recordings to quickly generate cloned voices.<\/p>\n\n\n\n<p>Some operations even maintain databases of voice profiles, specifically targeting high-value individuals or those with a strong social media presence.<\/p>\n\n\n\n<p>Looking ahead, he said that voice-based threats are likely to become even more advanced, incorporating emotional nuance, more accurate accent replication and potentially real-time language translation.<\/p>\n\n\n\n<p><strong>Call for public awareness and regulations<\/strong><\/p>\n\n\n\n<p>Dr Mazlan Abbas, technology expert and CEO of local IoT company Favoriot Sdn Bhd, believes that while detection tools are important, the best defence lies in public awareness and stronger regulation.<\/p>\n\n\n\n<p>Despite the increasing prevalence of such content, he said Malaysia currently has no specific legislation addressing deepfakes.<\/p>\n\n\n\n<p>\u201cEnforcement agencies are still relying on existing laws such as the Communications and Multimedia Act to investigate malicious content or scams.<\/p>\n\n\n\n<p>\u201cBut frankly, the technology is moving too fast for us to keep 
relying on old frameworks alone. We need regulations that are fit for this new digital era,\u201d he said.<\/p>\n\n\n\n<p>When asked whether AI literacy and deepfake awareness should be formally introduced into the national education curriculum, he said it is crucial to start equipping the younger generation for the realities of an AI-driven future.<\/p>\n\n\n\n<p>\u201cWe need to prepare our young people for this AI-driven world. Teaching deepfake awareness, digital ethics and AI literacy in schools will equip them with the critical thinking skills to question, verify and navigate the content they encounter online,\u201d he said.<\/p>\n\n\n\n<p>However, he cautioned that education efforts should not overlook older generations, as they are among the most vulnerable targets for scammers using deepfakes.<\/p>\n\n\n\n<p><em>* Coming up in Part 2: Real voices, real faces \u2014 all faked. We break down how AI-generated scams are hitting Malaysians hard, from cloned boss calls to deepfake videos featuring politicians.<\/em><\/p>\n","protected":false},"excerpt":{"rendered":"<p>KUALA LUMPUR, July 18&nbsp;\u2014 In today\u2019s digital world, seeing is no longer believing. With Artificial Intelligence (AI) becoming increasingly sophisticated, fake videos, audio clips and images that look and sound eerily real, known as deepfakes, are emerging as one of the biggest threats to truth and trust online. 
According to David Chak, co-founder and director [&hellip;]<\/p>\n","protected":false},"author":2,"featured_media":1989,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[7,2,25],"tags":[],"class_list":["post-1988","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-articles","category-international-news","category-science-it-ai-military-war"],"_links":{"self":[{"href":"https:\/\/myanmarmuslim.news\/en\/wp-json\/wp\/v2\/posts\/1988","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/myanmarmuslim.news\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/myanmarmuslim.news\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/myanmarmuslim.news\/en\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/myanmarmuslim.news\/en\/wp-json\/wp\/v2\/comments?post=1988"}],"version-history":[{"count":1,"href":"https:\/\/myanmarmuslim.news\/en\/wp-json\/wp\/v2\/posts\/1988\/revisions"}],"predecessor-version":[{"id":1990,"href":"https:\/\/myanmarmuslim.news\/en\/wp-json\/wp\/v2\/posts\/1988\/revisions\/1990"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/myanmarmuslim.news\/en\/wp-json\/wp\/v2\/media\/1989"}],"wp:attachment":[{"href":"https:\/\/myanmarmuslim.news\/en\/wp-json\/wp\/v2\/media?parent=1988"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/myanmarmuslim.news\/en\/wp-json\/wp\/v2\/categories?post=1988"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/myanmarmuslim.news\/en\/wp-json\/wp\/v2\/tags?post=1988"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}