Is This A Dark Age Of Technology?

 

We live in a world where information cannot be taken for granted. Images, documents and briefings can be modified or interpreted in different ways to suit different agendas. Understanding what is really going on is often extremely hard, and determining what is true and what is false is all but impossible. This may sound downbeat, but it is hard not to reach this conclusion and wonder how to discern fact from fiction in the information age. Three events in the last week have highlighted how easily information can be manipulated to shape opinion, with each involving some form of fakery, misrepresentation or lying to drive the story’s narrative.

The first example is the story from the Mail on Sunday which alleged that the ‘King’s Guards’ were living in squalor. In an article which ran as its top news story for much of the day, it was alleged that troops lived in squalid surroundings with no litter collection, foul toilets and disgusting fridges. This was put across as an example of daily life for the troops of the Household Division guarding His Majesty. The story got wide distribution thanks to its strong combination of shock factor and the potential for outrage, both at the state of the accommodation and at the perceived failure of the chain of command to address it.

A lot of people felt that something wasn’t quite right about the story, though, and started to investigate. Within hours it emerged that the story was allegedly ‘fake news’: many of the photos had been taken over an extended period of time, in different circumstances, and shared on a military WhatsApp group. They had then been passed to the Mail by a disaffected former soldier, along with a truly sensational story. The story was, it seems, utter rubbish, much like the image of the fridge, which a simple Google Images search showed had appeared on Reddit four years earlier under the name ‘cursed mini fridge’. Despite being widely debunked, the story remains live and unedited on the Mail’s website, false images and all. Perhaps one for IPSO to investigate?

The reason this matters is that it shows how easy it can be to manipulate imagery and information to construct a false story with no basis in reality. It shows that if images are easily shared and accessible, anyone can use them for nefarious purposes. At the very least it highlights the importance of giving military personnel good social media guidance, and of doing proper research into images before publishing them. All too often we assume that an image in the media is genuine and that the circumstances described are accurate – this should tell us that we need to be far more questioning about what we see.
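As a practical illustration of what that research can involve, the sketch below shows one common technique: comparing a ‘new’ photo against an older copy turned up by a reverse image search, using a perceptual hash that survives re-compression, resizing and minor crops. It is a minimal, hypothetical example assuming the third-party Pillow and imagehash Python packages; the file names are invented purely for illustration.

```python
# Minimal sketch: check whether a 'new' photo is a recycled copy of an
# older upload by comparing perceptual hashes. Assumes the third-party
# Pillow and imagehash packages (pip install Pillow imagehash).
# File names below are hypothetical.
from PIL import Image
import imagehash


def likely_same_image(path_a: str, path_b: str, threshold: int = 5) -> bool:
    """Return True if two images are perceptually near-identical.

    A small Hamming distance between perceptual hashes survives
    re-compression, resizing and small crops, so a 'new' photo that
    matches an old upload is a strong hint it has been recycled.
    """
    hash_a = imagehash.phash(Image.open(path_a))
    hash_b = imagehash.phash(Image.open(path_b))
    return (hash_a - hash_b) <= threshold


if __name__ == "__main__":
    # Hypothetical files: the image supplied to a newspaper and a copy
    # found via a reverse image search of older posts.
    print(likely_same_image("supplied_fridge.jpg", "reddit_2019_fridge.jpg"))
```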

The second story of concern was the emergence of imagery suggesting that there had been some kind of nuclear incident in Europe involving a US B61 tactical nuclear weapon. The Federation of American Scientists (FAS) had found an image, buried deep in a publicly accessible online archive, that appeared to indicate a nuclear weapon incident had occurred.

The image shows the bent rear of a nuclear weapon with staff gathered around inspecting it, and the report initially suggested it could be the first recorded incident of this type in Europe. The US military at first refused to comment on nuclear matters, allowing an information vacuum to develop and wild speculation that there was some kind of cover-up involving nuclear weapons in Europe. The only problem was that the image was nothing of the sort: it came from a briefing presentation by the ‘Accident Response Group’, which is responsible for handling incidents of this kind, and had in fact been taken during a training exercise. Only later did the US military confirm that no incident had occurred and that there was no reason for concern.


This is notable for two reasons. Firstly, the incident reminds us how easy it is to carry out geolocation and use imagery to confirm information. The full article shows the work done to identify the likely location of the weapon from other publicly available images, and the kind of in-depth analysis carried out to confirm it. This is a reminder that Open Source Intelligence (OSINT) is a phenomenally powerful discipline that can be used to great effect to support intelligence work. With the huge rise in images stored online, it is dangerously easy for innocuous images to be exploited by hostile powers for intelligence analysis or targeting through the accretion of data. It is vital that people understand that posting images which even hint at locations, or at the fittings inside buildings, can easily be turned to hostile use.
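To show just how much an ‘innocuous’ image can give away, the sketch below pulls GPS coordinates out of a photo’s EXIF metadata, one of the simplest building blocks of image-based OSINT. It is a minimal example assuming the third-party Pillow package; the file name is purely illustrative, and while many platforms strip this metadata on upload, original files shared directly often retain it.

```python
# Minimal sketch: extract GPS coordinates from a photo's EXIF metadata.
# Assumes the third-party Pillow package is installed; the file name is
# illustrative only.
from PIL import Image
from PIL.ExifTags import TAGS, GPSTAGS


def to_degrees(values) -> float:
    """Convert EXIF (degrees, minutes, seconds) rationals to a decimal."""
    d, m, s = (float(v) for v in values)
    return d + m / 60.0 + s / 3600.0


def gps_from_photo(path: str):
    """Return (latitude, longitude) if the image carries GPS EXIF data."""
    exif = Image.open(path)._getexif() or {}
    gps_raw = next((v for k, v in exif.items() if TAGS.get(k) == "GPSInfo"), None)
    if not gps_raw:
        return None
    gps = {GPSTAGS.get(k, k): v for k, v in gps_raw.items()}
    lat = to_degrees(gps["GPSLatitude"])
    lon = to_degrees(gps["GPSLongitude"])
    # Southern and western hemispheres are recorded as positive values
    # plus a reference letter, so flip the sign where needed.
    if gps.get("GPSLatitudeRef") == "S":
        lat = -lat
    if gps.get("GPSLongitudeRef") == "W":
        lon = -lon
    return lat, lon


if __name__ == "__main__":
    print(gps_from_photo("barracks_photo.jpg"))  # hypothetical file
```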

The other concern is that public debate and opinion can be shaped remarkably quickly by misleading reporting that is not promptly corrected. Opinions form fast and rarely wait for the facts of the matter to become clear. If a story emerges indicating that a nuclear weapons incident has occurred, then it is likely that people will be hugely concerned. A number of pressure groups such as CND used it as an opportunity to make statements furthering their cause and policy goals, with articles remaining live even after it was clear this was a ‘non-story’. The lesson here is that the ‘no comment’ approach that has sufficed for nearly 80 years of nuclear operations is no longer sufficient. If no comment is made, an information vacuum emerges which can be exploited by those who want to cause mischief. A key lesson of the internet age is that it is not possible to say nothing at all, and that rapid response and rebuttal are key – how do governments and armed forces respond to issues they would rather not discuss at all, in a way that prevents silence from causing deeper problems?

The deeper problems include the faking of information to help shape an entirely different narrative. This is exactly what is happening at the time of writing, as it appears that classified US intelligence material relating to Ukraine has been leaked onto the internet, but amended in the process to show very different results. It is suggested that Russian figures have doctored the statistics on estimated Ukrainian casualties to make them far higher than they actually are, and to show Ukrainian munitions and supplies in a worse state. The challenge is working out what is actually true: are the leaked documents accurate, are they an extremely clever effort to muddy the waters and fight an information operations campaign that shapes public views, or is there something else going on? This incident highlights the difficulty of knowing what to believe and when to believe it – if you see these documents leaked, how do you work out which version is genuine and which is inaccurate or faked? Do most people have the time or interest to do so, or do they simply believe the first thing they read?
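One small technical aside on why this is so hard: a cryptographic hash will show in an instant that two copies of a document differ, even by a single character, but it cannot say which copy is genuine unless a trusted reference hash was published before the leak. The sketch below, using only Python’s standard library and invented file names, illustrates the point.

```python
# Minimal sketch: show that any edit to a document, however small,
# produces a completely different SHA-256 digest. Standard library only;
# file names are hypothetical. A digest only proves two copies differ --
# without a trusted reference hash it cannot say which copy is genuine.
import hashlib


def sha256_of(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


if __name__ == "__main__":
    for name in ("briefing_original.pdf", "briefing_reposted.pdf"):
        print(name, sha256_of(name))
```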

From a military perspective, the growth in leaking and editing information highlights the continued importance of being able to conduct proper ‘psyops’ and to ensure there is a clear and robust communications chain that provides verifiably true narratives. If the public do not believe what their national government and media outlets are telling them, then it is much harder to make the case for operations or to help people understand progress. It is easier than ever to create fake news or mislead people, and military and government communicators need to be alert to this and have ways of calling out fakery quickly.

This move to a growing grey area of ‘fake news’ comes at a fascinating time for wider students of government policy. The National Archives in the UK releases government files each year showing how policy decisions were made and providing insight into what was going on behind the scenes of Whitehall. Historically these releases have consisted of huge files of typewritten memos and documentation spanning years of work, telling the story of government from a practical perspective. They serve as a ‘narrative of the truth’: these documents are the official government record of what really went on. But the value of these file releases may quickly dwindle.

As the National Archives begins to release information from the 2000s onwards, we will see file content shift away from beautifully curated paper files towards intermittent glimpses of policy, depending on which files and records happened to be saved electronically. One of the great disasters of the 2000s was the indecent haste with which Government moved from paper files to ‘paperless’ (hah!) offices and stopped running registries with the same diligence and efficiency. Many admin clerks who had spent decades curating files, ensuring they were properly stored and kept up to date, were downsized and replaced with haphazard training on how to upload documentation to shared areas and team sites, with mixed results. It is a grim story. So much of what has happened since is unlikely to have been fully captured or recorded, and future historians will have only glimpses of the records that some people saved, while others will see their work go unrecorded and unprotected.

This is worrying for two reasons. Firstly, it makes it hard to understand our national journey if there is no adequate record of what was done and why. There will be many events for which people expect releases of information in future decades, only to discover that the records are flimsy and without meaningful content. Historians will struggle to make much of this period, particularly given the rapid decline of the written word in the form of letters, memos and other ways of communicating with each other. There will be a void of history where once there was plenty.

The other concern is the growth of AI and the means to rapidly create documents that are historically false but can be used to shape a convenient narrative. Even in the infancy of ChatGPT it is clear that the potential exists to create documents or records that can be used to shape understanding or gently twist historical truth. Over time, as the participants fade away, we become reliant on snippets of history to understand what did, or did not, really happen. It is entirely plausible that in 30 to 50 years’ time we could see radical revisionism of historical incidents, or, closer in, the use of ChatGPT to create news articles or blogs that call into question events that actually happened. The risk is that it will become ever harder to spot fake information or to tell truth from fiction. In 150 years’ time, how will researchers tell the stories of the early 21st century, and how will they know what was real and what was made up by an AI machine?

This may sound rather far-fetched, but we need to consider how easily fake information spreads and how believable it can be. If we do not take appropriate steps to spot it, tackle it and provide a robust counter-narrative, then we run the risk that history will be written not by the victors, but by the robots of the losers. Ensuring that the truth prevails amid a rise in fakery will be ever more important. Perhaps the time is rapidly approaching when we need to ask ‘are we entering a dark age of technology?’ and, if so, what do we do about it?
