Volumen 35 No. 3 (julio-septiembre) 2026, pp. 9-24

ISSN 1315-0006. Depósito legal pp 199202zu44

DOI: 10.5281/zenodo.19687145

Hybrid Conflicts: Digital Media as an Instrument of Manipulation. What Has It Been Like in Ukraine?

Andriy Sogorin*, Ihor Atamanenko**, Viktor Purnak***,

Ihor Martynov**** and Oleksii Halukh*****

Abstract

In hybrid wars, digital media have become a powerful instrument of political manipulation of consciousness, so a deep understanding of the mechanisms of such manipulation is a key task in finding ways to counteract it. The purpose of the article is to identify, classify and quantify the main manipulative techniques used in digital media in the context of a hybrid conflict (using the Russian-Ukrainian war as a case study), and to assess the specifics of their use across different social networks. To that end, a content analysis of 500 posts published in 2024 and early 2025 on the Facebook, X (Twitter) and TikTok platforms was conducted, with the sample divided into pro-Russian and pro-Ukrainian sources. The results revealed a profound asymmetry: manipulative techniques were far more widespread in pro-Russian sources, in particular dehumanization of the enemy, conspiracy theories and visual disinformation. TikTok proved the most manipulation-saturated platform, with visual manipulation as its main tool of influence. A stable negative correlation was found between trust in the information source and the amount of manipulation. The findings confirm the systemic nature of the information operation and point to the need for a comprehensive response based on media literacy, support for independent media, international cooperation, and accountability of digital platforms.

Keywords: digital media; manipulation; hybrid conflict; propaganda; disinformation; social media

*Kyiv Institute of the National Guard of Ukraine, Ukraine. ORCID: https://orcid.org/0009-0009-5900-7139

*Corresponding author: andrey.sogorin@ujis.in.ua

**Kyiv Institute of the National Guard of Ukraine, Ukraine. ORCID: https://orcid.org/0000-0001-8959-5423

***Kyiv Institute of the National Guard of Ukraine, Ukraine. ORCID: https://orcid.org/0009-0002-2214-9351

****Kyiv Institute of the National Guard of Ukraine, Ukraine. ORCID: https://orcid.org/0000-0002-6034-0926

*****Kyiv Institute of the National Guard of Ukraine, Ukraine. ORCID: https://orcid.org/0009-0001-2095-9564

Received: 08/01/2026 Accepted: 15/03/2026

Conflictos Híbridos: Los medios digitales como instrumento de manipulación. ¿Cómo ha sido en Ucrania?

Resumen

En las guerras híbridas, los medios digitales se han convertido en un poderoso instrumento de manipulación política de la conciencia, por lo que comprender en profundidad los mecanismos de dicha manipulación es fundamental para encontrar formas de contrarrestarla. El objetivo del artículo es identificar, clasificar y cuantificar las principales técnicas de manipulación utilizadas en los medios digitales en el contexto de un conflicto híbrido (tomando como ejemplo la guerra entre Rusia y Ucrania), así como evaluar las particularidades de su uso en diversas redes sociales. Con ese fin, se llevó a cabo un análisis de contenido de 500 publicaciones de las plataformas Facebook, X (Twitter) y TikTok, clasificando las muestras en fuentes prorrusas y proucranianas. Los resultados revelaron una profunda asimetría: las técnicas manipuladoras estaban más extendidas en las fuentes prorrusas, en particular la deshumanización del enemigo, las teorías de la conspiración y la desinformación visual. TikTok fue la plataforma más manipuladora, utilizando la manipulación visual como principal herramienta de influencia. Se constató que existe una correlación negativa estable entre la confianza en la fuente de información y el grado de manipulación. Los resultados del estudio confirman el carácter sistémico de la operación de información y apuntan a la necesidad de una respuesta integral basada en la alfabetización mediática, el apoyo a los medios independientes, la cooperación internacional y la rendición de cuentas de las plataformas digitales.

Palabras Clave: medios digitales; manipulación; conflicto híbrido; propaganda; desinformación; redes sociales

Introduction

The beginning of the 21st century was marked by a significant shift in the paradigm of interstate and intersystemic conflict. Traditional all-out wars fought along front lines by large armies are giving way to hybrid conflicts. Under these new conditions, physical space is no longer the main battlefield; it is increasingly the cognitive space, namely the inner world of the individual and his or her consciousness. The information sphere is turning into an arena of confrontation in which influence is exerted not only through facts but also through emotionally colored narratives, and the principal target is society's ability to comprehend reality rationally.

Understanding the detailed mechanisms of information influence is a prerequisite for maintaining individual autonomy and critical thinking in a chaotic information environment. The problem is more complex than merely condemning misinformation: it calls for a deeper examination of how the architecture of information delivery (its emotional content, repetition, and source) shapes an individual's beliefs and decision-making. According to researchers of cognitive distortions and media processes, in the era of "post-truth" we are witnessing not only political manipulation but also significant changes in the way societies produce and interpret knowledge about the world around them (Romański, 2025; Prohoniuk, 2025; Barnard, 2024; Pocheptsov, 2016). The analysis of these phenomena is therefore of existential importance: it serves as a tool for building the information resilience of the individual in the face of modern challenges.

The aim of the article is to identify, classify and quantitatively analyze the main manipulative techniques used in digital media in the context of a hybrid conflict (using the Russian-Ukrainian war as a case study), as well as to assess the specifics of their use across various social networks.

Literature Review

Interstate competition changed substantially at the beginning of the 21st century. Traditional wars with well-established front lines are being supplanted by "hybrid warfare", in which the principal battlefield is no longer physical terrain but the human mind. These shifts are driven by the accelerating digitalization of everyday life. Digital media are now firmly embedded in daily routines and offer unparalleled opportunities for communication, but also for manipulation. As Zuboff (2019) argues, the business model of digital-platform capitalism monetizes human attention and emotion, and thus provides an ideal environment for information power. People narrate their lives online and, whether they realize it or not, are in turn steered by algorithms and by actors seeking to control their choices.

In this context, scholars around the world increasingly speak of an "era of disinformation" (Bennett and Livingston, 2020), in which the line between truth and lies is blurred by systematic information warfare. Wardle and Derakhshan (2017) encourage us to think in terms of "information disorder", emphasizing that facts coexist in the same information environment as rumors and disinformation, so that people struggle to make sense of the flow of information. Ecker et al. (2022) add a psychological dimension: people tend to trust information that is consistent with their emotions and worldview, and correcting disinformation, even blatantly false information, may be ineffective because of deeply ingrained cognitive processes.

The Russian-Ukrainian war is a tragic demonstration of how extreme this information warfare can become. Geissler et al. (2023) document the emergence of complex networks of coordinated social media accounts in Russian propaganda designed to sow panic, foster social disintegration, and justify aggression. Horbyk et al. (2023) trace the evolution of Soviet "active measures" into the modern digital age: the levers of influence on mass consciousness have merely evolved. Jakubowski and Zinichenko (2024) offer particularly insightful findings on the adaptation of propaganda to the short-video format of TikTok, showing that emotional content and visual presentation were more persuasive than complex arguments.

Ukraine's own experience of this information war is somewhat different. Pasitselska (2022) explores how ordinary Ukrainian citizens cope with daily encounters with propaganda and develop critical consciousness: some rely on established sources of information, others on the opinions of people close to them, and some intuitively learn to recognize manipulation. Zymomrya and Shevchuk (2024) examine the wartime transformation of Ukraine's media landscape, which becomes both a battlefield and a space of resistance.

The problem of manipulative influence goes beyond any single conflict. In the war of public opinion characteristic of the information age, victory depends not only on success on the battlefield but also on the ability to hold the interest and trust of the audience. Mustață et al. (2023) focus on the cognitive aspects: how our thinking and education shape our ability to distinguish truth from falsehood, which makes media literacy not just a desirable skill but a necessary condition for survival in the information environment.

At the level of national and international security, Polyakova and Boyer (2018) predict an escalation of political warfare in the digital age and treat the war of public opinion as a determining factor of the information era. Recent reports (European External Action Service, 2025) document the systemic nature of foreign interference and information manipulation, which pose a growing threat to democratic societies. Thus, the existing literature provides a solid theoretical basis for empirical research on manipulative techniques in digital media, but leaves room for further analysis of their specifics in the context of active hostilities and across digital platforms.

Materials and Methods

The empirical study was carried out in multiple phases. First, a representative sample was constructed to reflect the informational environment an average citizen encounters daily. From January 2024 to February 2025, using a continuous sampling scheme, 500 posts were collected from three platforms: Facebook, X (formerly Twitter), and TikTok. These platforms represent different communication ecologies: Facebook accommodates longer texts and an older audience; X offers instantaneous reactions and agenda-setting; TikTok favors visual and emotional processing of information, with reasoned debate replaced by immediate emotional response. The selected posts were divided into two groups by origin. The first group (250 posts) consisted of pro-Kremlin and pro-Russian actors: official Russian media, pro-Russian bloggers and anonymous Telegram channels disseminating narratives consistent with the Kremlin's official position. The second group (250 posts) consisted of pro-Ukrainian and pro-Western sources: Ukrainian state and non-state media, volunteer projects, bloggers and international journalists reporting from a Ukraine-sympathising position. This division, albeit crude, follows earlier research on information campaigns in hybrid conflicts (Bradshaw & Howard, 2018; Geissler et al., 2023), which treats disinformation as coordinated activity by opposing sides.

To maintain validity and representativeness, posts had to meet the following criteria: (1) publication between 1 January 2024 and 18 February 2025; (2) content pertaining to the Russian-Ukrainian war (military operations, political decisions, humanitarian consequences, sanctions, or historical/cultural narratives); (3) an account with at least 1,000 followers; (4) public availability; and (5) language: Ukrainian, Russian, or English. Posts were excluded if they were purely commercial or entertainment content, duplicates, had fewer than 10 engagements, came from deleted or suspended accounts, or linked to an outside website without original commentary. For each source group, every tenth post meeting the inclusion criteria was retrieved, in random order, until 250 posts per side were reached. The final sample comprised 500 posts: 167 from Facebook, 167 from X, and 166 from TikTok. An a priori power analysis with G*Power (version 3.1.9.7) showed that at least 88 posts per group were necessary to detect a medium effect size (Cohen's w = 0.30) with α = 0.05 and power = 0.95; the chosen sample of 250 posts per group (500 in total) therefore exceeds this threshold for primary comparisons and allows robust sub-group analyses by platform, quarter, and account type. Statistical analyses consisted of descriptive statistics, chi-square tests, independent-samples t-tests, one-way ANOVA, Pearson correlations, and Cohen's d for effect sizes, performed in SPSS 26.0 and JASP 0.18.1, with significance set at p < 0.05 and Bonferroni correction applied for multiple comparisons.
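For transparency, the following minimal sketch shows how such an a priori power calculation can be reproduced in Python with statsmodels. This is our illustration, not the authors' actual G*Power procedure, and it assumes a chi-square test with one degree of freedom; the exact figure depends on the test family and degrees of freedom selected in G*Power.

```python
# A sketch of the a priori power analysis, assuming a chi-square design with
# df = 1 (n_bins = 2); exact output depends on the test family chosen in G*Power.
from statsmodels.stats.power import GofChisquarePower

analysis = GofChisquarePower()
n_required = analysis.solve_power(
    effect_size=0.30,  # Cohen's w, a medium effect
    nobs=None,         # solve for the required total number of observations
    alpha=0.05,
    power=0.95,
    n_bins=2,          # bins = df + 1 for a one-degree-of-freedom test
)
print(f"Required sample size: {n_required:.0f} posts")
```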

The primary instrument of analysis was quantitative content analysis, supplemented by qualitative discourse analysis. Together, these methods enabled us to determine frequencies while also analyzing deeper meanings, context and emotional tinge, all of which shape the real effect of information on the human mind. Drawing on theoretical work on information influence (Lewandowsky et al., 2017; Wardle & Derakhshan, 2017; Marwick & Lewis, 2017), we constructed a codifier of eight distinct manipulative techniques, each grounded in a particular psychological mechanism: emotional coloring (impeding cognitive reflection), half-truths / facts taken out of context (leveraging trust in real events while manipulating their meaning), visual deception (using unrelated or altered images), conspiracy theories (explaining complex events through hidden forces), appeal to fear (overriding rational appraisal through threats), appeal to authority (referencing unnamed "experts"), dehumanization of the enemy (conditioning the public psyche to accept violence) and signs of bot farms / fake accounts (simulating mass support or discussion).
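To make the codifier concrete, the sketch below models a per-post coding record as a data structure. The field names are our own illustrative choices rather than the authors' instrument; the technique count mirrors field 23 of the recording sheet in the Appendix.

```python
# A sketch of a per-post coding record; field names are illustrative, not the
# authors' actual instrument.
from dataclasses import dataclass, fields

@dataclass
class CodedPost:
    emotional_coloring: int = 0    # 1 = present, 0 = absent
    half_truth: int = 0            # facts taken out of context
    visual_deception: int = 0      # unrelated or altered images
    conspiracy_theory: int = 0     # hidden-forces explanations
    appeal_to_fear: int = 0        # threats overriding rational appraisal
    appeal_to_authority: int = 0   # unnamed "experts"
    dehumanization: int = 0        # the enemy stripped of humanity
    bot_signs: int = 0             # simulated mass support

    def technique_count(self) -> int:
        # Sum of "yes" marks across the eight binary codes (0-8),
        # mirroring field 23 of the recording sheet in the Appendix.
        return sum(getattr(self, f.name) for f in fields(self))

post = CodedPost(emotional_coloring=1, half_truth=1, visual_deception=1)
print(post.technique_count())  # -> 3
```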

To guarantee objectivity and reproducibility, an inter-coder reliability test was performed. Two independent coders (graduate-level researchers with full proficiency in Ukrainian, Russian, and English) completed a 4-hour training session on the coding scheme using 20 practice posts not included in the final sample. A randomly drawn subsample of 100 posts (20% of the whole, stratified by platform, source group, and quarter) was double-coded. Inter-coder reliability was assessed in two ways: percent agreement and Cohen's κ. Substantial or higher agreement was reached in all categories (κ ≥ 0.72), with most reaching almost perfect agreement (κ ≥ 0.81): emotional coloring (κ = 0.88), half-truths (κ = 0.84), visual deception (κ = 0.92), conspiracy theories (κ = 0.95), appeal to fear (κ = 0.81), appeal to authority (κ = 0.76), dehumanization (κ = 0.90), bot signs (κ = 0.72), post tone (κ = 0.83), main narrative (κ = 0.79), and source reliability (κ = 0.86). Disagreements were resolved through consensus, and the final coding reflects that consensus. These high coefficients demonstrate the robustness of the coding tool. All 500 posts were coded by hand, since no available automated system can capture all the nuances of language, intonation and implicit messages in human communication. We made a conscious decision to view this data flow through the eyes of a layperson: someone who scrolls through news feeds every day, shares impressions with friends and family, and gradually forms an understanding of the world from what they read and see.
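As an illustration of the reliability check, the sketch below computes Cohen's κ and percent agreement for one category from two coders' parallel binary decisions. The decisions shown are invented for illustration, and scikit-learn is our assumed tool, not necessarily the one the authors used.

```python
# A sketch of the inter-coder reliability check for a single binary category;
# the decisions below are illustrative, and scikit-learn is our assumed tool.
from sklearn.metrics import cohen_kappa_score

coder_a = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
coder_b = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]

kappa = cohen_kappa_score(coder_a, coder_b)
agreement = sum(a == b for a, b in zip(coder_a, coder_b)) / len(coder_a)
print(f"Cohen's kappa = {kappa:.2f}, percent agreement = {agreement:.0%}")
```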

A structured data recording sheet is available in Appendix A, and a summary table of primary research data (N=500) is presented in Appendix B.

Results

The content analysis of 500 posts from the Facebook, X (Twitter) and TikTok platforms, selected in accordance with the methodology described in the Materials and Methods section, yielded a number of important quantitative and qualitative indicators of the use of manipulative techniques in coverage of the Russian-Ukrainian war. A summary table of the primary research data, with full identification data and selection parameters (N=500), is presented in Appendix B.

The analysis was carried out using the developed codifier, which covered the eight main manipulative techniques identified on the basis of prior theoretical work (Lewandowsky et al., 2017; Wardle & Derakhshan, 2017; Marwick & Lewis, 2017). The main results are presented below as a series of analytical tables and figures with accompanying explanations.

Figure 1. Total frequency of using manipulative techniques (N=500)

As Figure 1 shows, the most common technique for influencing public opinion was emotional coloring, present in 68% of the analyzed posts. This confirms the thesis of Lewandowsky et al. (2017) that in the digital era information spreads not so much through rational reflection as through emotional response. A person scrolling through a news feed reacts first to whatever evokes strong feelings: anger, compassion, fear, or indignation. It is these emotions that dampen critical thinking and push people to share information without checking it.

The second most common technique was half-truths, or quotes and facts taken out of context (45%). This is a particularly insidious weapon because it exploits trust: a person sees a real event or fact, but it has been stripped of the context that gives it meaning. This form of information pollution is the hardest to recognize and counteract precisely because it contains a grain of truth (Wardle & Derakhshan, 2017).

The prominence of visual deception (40%) and appeals to fear (35%) is no accident. Visual information appears more trustworthy than text and elicits a stronger emotional response, while fear has been shown to block critical thinking, leaving people prone to seek out simplistic answers and "defenders".

Table 1. Comparison of the frequency of technique use by source group (%)

| Manipulation technique | Pro-Russian sources (n=250) | Pro-Ukrainian sources (n=250) | Difference |
|---|---|---|---|
| Emotional coloring | 74% | 62% | +12% |
| Half-truths / quotes out of context | 68% | 22% | +46% |
| Visual deception | 62% | 18% | +44% |
| Appeal to fear | 48% | 22% | +26% |
| Dehumanization of the enemy | 58% | 6% | +52% |
| Conspiracy theories | 44% | 4% | +40% |
| Appeal to authority | 20% | 24% | –4% |
| Signs of bot farms / fake accounts | 34% | 4% | +30% |
| Average techniques per post | 4.08 | 1.62 | 2.5 |

Table 1 reveals a pronounced imbalance between the two groups of sources. Pro-Russian accounts employ roughly 2.5 times as many manipulative techniques per post as pro-Ukrainian accounts (4.08 vs. 1.62). This points not merely to different communication styles but to systematic, orchestrated information warfare, as described by Bradshaw and Howard (2018) and Geissler et al. (2023).
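As an illustration, the group difference for a single technique can be tested by rebuilding a 2x2 contingency table from the reported percentages. The sketch below does this for dehumanization (58% of 250 pro-Russian vs. 6% of 250 pro-Ukrainian posts); it is our reconstruction from Table 1, not the authors' code.

```python
# A sketch rebuilding the dehumanization comparison from Table 1 as a 2x2
# chi-square test; counts are derived from the reported percentages.
from scipy.stats import chi2_contingency

#                     present  absent
table = [[145, 105],  # pro-Russian:   58% of 250 posts
         [15, 235]]   # pro-Ukrainian:  6% of 250 posts

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2({dof}) = {chi2:.1f}, p = {p:.2g}")
```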

The biggest differences concern dehumanization of the enemy (+52%), half-truths (+46%) and visual deception (+44%). Dehumanization, the portrayal of the opposing side as "non-humans", "fascists" or "orcs", is a particularly dangerous technique, as it psychologically prepares society to accept violence and strips the enemy of any claim to sympathy. As Horbyk et al. (2023) note, this tactic has deep roots in Soviet methods of "active measures" and demonstrates the continuity of propaganda approaches.

Conspiracy theories, which are practically absent from pro-Ukrainian content (only 4%), appear in 44% of pro-Russian publications. This confirms the observation of Pasitselska (2022) that Russian propaganda actively uses conspiracy narratives to create an alternative reality in which simple answers to complex questions are presented as truth and audiences are offered the illusion of "secret knowledge".

Figure 2. Distribution of manipulative techniques by platforms

Figure 2 underscores the role of the platform in the dissemination of manipulative content. TikTok is the most worrisome, showing the highest average number of manipulative techniques per post (3.3), particularly among pro-Russian sources (4.9). This supports the findings of Jakubowski and Zinichenko (2024) on the evolution of propaganda in the era of short videos, where emotional messages and visuals are more effective than complex arguments.

The share of visual deception in the pro-Russian segment of TikTok is staggering: 88%. Roughly nine out of ten videos produced by pro-Russian sources contain manipulated images or video sequences: recycled footage passed off as new, footage from other locations labeled as coming from the area under threat, or wholly generated material. For a generation of young people whose main source of information is TikTok, this density of manipulation poses a serious risk of a distorted view of the world.

Facebook and X show the same pattern, though at somewhat lower rates.

Figure 3. Breakdown by account type and source group

As Figure 3 shows, both categories comprise different types of accounts, though pro-Russian posts come from a somewhat higher number of anonymous sources (16 vs. 12). Anonymous Telegram channels and public pages, as Geissler et al. (2023) emphasize, often carry the most aggressive manipulations, since they bear no formal responsibility for their content and can act more radically.

Table 2. Breakdown by tone of posts

| Tone | Pro-Russian (n=250) | Pro-Ukrainian (n=250) | Total (N=500) |
|---|---|---|---|
| Aggressive | 120 (48%) | 25 (10%) | 145 |
| Accusatory | 60 (24%) | 40 (16%) | 100 |
| Neutral | 20 (8%) | 75 (30%) | 95 |
| Sympathetic | 10 (4%) | 60 (24%) | 70 |
| Patriotic/uplifting | 40 (16%) | 50 (20%) | 90 |
| Total | 250 | 250 | 500 |

The tone of the posts (Table 2) is an important indicator not only of content but of the emotional state these posts transmit to the audience. Almost half of the pro-Russian posts (48%) have an aggressive tone, which correlates with high levels of dehumanization and appeals to fear. Such aggressive rhetoric does not merely inform; it instills in the reader a state of hostility and readiness for confrontation.

Pro-Ukrainian sources, meanwhile, are far more likely to be neutral (30%) or sympathetic (24%). The contrast in sympathetic tone (24% vs. 4%) is especially illuminating and reflects a focus on human stories, losses and help; in other words, what Pasitselska (2022) describes as "sensory understanding of war through everyday experiences".

Table 3. Distribution of manipulation techniques by source reliability level

| Reliability level | Pro-Russian | Pro-Ukrainian | Average number of techniques |
|---|---|---|---|
| High | 4 (8%) | 28 (56%) | 1.2 |
| Medium | 12 (24%) | 18 (36%) | 2.4 |
| Low | 34 (68%) | 4 (8%) | 4.7 |

Our results show a clear pattern: the lower the credibility of a source, the greater the number of manipulation techniques applied (Table 3). Low-credibility sources apply an average of 4.7 manipulation techniques per post, almost four times more than high-credibility sources (1.2) (see Figure 5).

This confirms a common-sense but significant thesis: the less truthful the information, the more manipulative means are needed to sell it to the public. The gap in the distribution of sources by credibility level is equally striking: 68% of pro-Russian sources fall into the low-credibility category, while 56% of pro-Ukrainian sources are rated as high credibility. This does not mean pro-Ukrainian sources are entirely free of spin, but it does point to a large difference in factual accountability and reliance on verified information.
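The pattern in Table 3 can be checked on post-level data with a simple group comparison and a rank correlation, as in the sketch below. The rows and column names are illustrative only; a rank correlation is one reasonable choice for the ordinal credibility scale, while the authors report Pearson correlations.

```python
# A sketch of the credibility-manipulation analysis behind Table 3, assuming
# post-level data with an ordinal reliability code (1 = high, 2 = medium,
# 3 = low); rows and column names are illustrative only.
import pandas as pd
from scipy.stats import spearmanr

posts = pd.DataFrame({
    "reliability": [1, 1, 2, 2, 3, 3, 3],
    "techniques":  [1, 2, 2, 3, 4, 5, 5],
})

print(posts.groupby("reliability")["techniques"].mean())  # cf. Table 3 means
rho, p = spearmanr(posts["reliability"], posts["techniques"])
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```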

Table 4. The most common combinations of techniques

| Rank | Combination of techniques | Frequency | Predominant side |
|---|---|---|---|
| 1 | Emotional coloring + Half-truth + Visual deception | 28 | pro-Russian (89%) |
| 2 | Emotional coloring + Appeal to fear | 24 | pro-Russian (75%) |
| 3 | Emotional coloring + Dehumanization | 22 | pro-Russian (95%) |
| 4 | Emotional coloring + Conspiracy theories | 18 | pro-Russian (94%) |
| 5 | Emotional coloring + Appeal to authority | 16 | pro-Ukrainian (56%) |

Table 4 shows that manipulative techniques are rarely applied alone; more often they form stable combinations. The triad "emotional coloring + half-truth + visual deception" is the most frequent and appears almost exclusively in pro-Russian sources (89%). It produces an "information trap": emotion commands attention and disables critical thinking, images create a sense of credibility, and facts taken out of context are difficult to verify quickly.

Table 5. Distribution of the main narratives

| Main narrative | Pro-Russian (n=250) | Pro-Ukrainian (n=250) | Total (N=500) |
|---|---|---|---|
| War crimes (accusations against the other side) | 140 | 75 | 215 |
| Help / volunteering | 10 | 90 | 100 |
| Political decisions / diplomacy | 40 | 50 | 90 |
| History / culture / identity | 40 | 20 | 60 |
| Economy / sanctions | 20 | 15 | 35 |
| Total | 250 | 250 | 500 |

The analysis of the main narratives (Table 5) reveals the different focuses of the two groups of sources. For pro-Russian sources the dominant narrative is "war crimes of the other side" (56%). As the preceding tables show, these accusations are most often supported by half-truths, visual deception and dehumanization, a strategy in line with the classic propaganda formula of accusing the enemy of what you yourself do.

Pro-Ukrainian sources, by contrast, focus far more on the "help/volunteering" narrative (36% vs. 4% in pro-Russian content). This reflects a mobilization of internal resources to care for defenders and victims and to build a community of mutual aid. As Pasitselska (2022) points out, during war horizontal ties and everyday practices of assistance become a significant wellspring of resilience and trust.

Table 6. Dynamics of technique use by quarter of 2024

| Quarter | Pro-Russian (average techniques per post) | Pro-Ukrainian (average techniques per post) | Difference |
|---|---|---|---|
| Q1 (January-March) | 3.9 | 1.5 | 2.4 |
| Q2 (April-June) | 4.2 | 1.7 | 2.5 |
| Q3 (July-September) | 4.1 | 1.6 | 2.5 |
| Q4 (October-December) | 4.1 | 1.7 | 2.4 |

Table 6 shows an important finding: the identified trends remain stable throughout 2024. This refutes the possible assumption that the high concentration of manipulative techniques is random or related to individual events. Instead, we see a systematic, institutionalized nature of the information policy of pro-Russian sources that consistently use manipulation regardless of the specific situation on the frontline or in politics.

Pro-Ukrainian sources demonstrate a consistently lower level of manipulative techniques (in the range of 1.5–1.7), which may indicate both a conscious editorial policy focused on facts and lower resource capacities for conducting complex information operations.
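The stability claim can be formalized with the one-way ANOVA listed in the Methods, comparing per-post technique counts across quarters. The sketch below uses invented per-post counts for illustration, not the study's raw data.

```python
# A sketch of the quarterly stability check (one-way ANOVA across quarters);
# the per-post counts below are illustrative, not the study's raw data.
from scipy.stats import f_oneway

q1 = [4, 3, 5, 4, 3]
q2 = [4, 5, 4, 4, 4]
q3 = [4, 4, 5, 3, 4]
q4 = [4, 4, 4, 5, 3]

f_stat, p = f_oneway(q1, q2, q3, q4)
print(f"F = {f_stat:.2f}, p = {p:.3f}")  # a large p is consistent with stability
```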

Discussion

Emotional coloring, half-truths, and visual deception are the most common techniques, which confirms the thesis of Lewandowsky et al. (2017) about "post-truth" as a defining feature of modern communication. Behind these techniques lies an attempt to influence not a person's rational thinking but his or her emotional sphere: to provoke anger, fear, compassion or indignation. And it works, because an emotionally overwhelmed person is less likely to ask whether the information is trustworthy. They post, share, comment and react, becoming part of the information flow, often without realising they are being manipulated. As Wardle and Derakhshan (2017) note, today's "information disorder" is also defined by the fact that emotionally charged content travels further and faster than neutral, balanced content. Understanding this is the first step to defending against it: we can all learn to recognize when information appeals to our emotions more than to our minds, and to pause before reacting or sharing.

A notably deep asymmetry between pro-Russian and pro-Ukrainian sources emerged as especially significant. The fact that pro-Russian sources employ 2.5 times as many manipulative techniques, including such dangerous ones as dehumanization of the enemy, conspiracy theories, and systematic visual manipulation, speaks to more than distinct communication strategies: it reflects a radically different attitude toward the audience.

Dehumanization isn’t only a rhetorical device. Equally disturbing is the finding that TikTok is the most saturated manipulative environment, especially in the pro-Russian segment, where 88% of posts contain visual deception. Behind these percentages are real young people - the platform’s main audience - who consume short videos every day, often without the time or skills to critically reflect on what they see. (Jakubowski & Zinichenko, 2024) rightly point to the transformation of propaganda in the era of short videos: an emotional message and visual image work more effectively than complex arguments, and the platform’s algorithms, focused on maximum engagement, objectively promote the dissemination of such content. A young person who sees dozens of videos every day is unable to check each of them, and gradually his or her picture of the world is shaped by systematic manipulation. The responsibility here cannot lie solely with the individual consumer.

The clear inverse correlation between a source's credibility and the number of manipulative techniques it uses poses an almost insoluble task for the average person. Low-credibility sources (mostly pro-Russian) apply four times as many manipulative techniques as high-credibility ones, so the more a person is manipulated, the harder it becomes to see reality: manipulators' messages are typically more colorful, emotional and "convincing" at first glance. Ecker et al. (2022) explain this with two psychological mechanisms: emotionally charged information is better remembered, and refuting it takes cognitive effort that not everyone can muster in the stream of daily information. The way out of this predicament is to cultivate new information habits: rather than mindlessly scrolling the news feed, curate a media diet deliberately, select a handful of reliable sources, check who is behind the information, and use fact-checking tools before believing a catchy post.

Particularly notable is the stability of these dynamics throughout 2024, which refutes the assumption that the high concentration of manipulative techniques is accidental or tied to individual events. The information war continues regardless of the situation at the front or in politics, and digital media are its main battlefield. A person living in this environment is under systematic daily pressure, and that pressure does not ease when the hot phase of the conflict subsides. As Prohoniuk (2025) rightly points out, the war of public opinion in hybrid conflicts is a defining factor of the information age, and its intensity does not depend on direct hostilities. This awareness should lead to the recognition of cognitive security as a component of national security: governments should develop national cognitive-security strategies, protect independent media and fact-checking organizations, make media literacy a compulsory part of education at all levels, and strengthen international cooperation to detect and neutralize coordinated disinformation campaigns. For teachers and parents, the task is to make media literacy part of everyday life: learning to spot manipulation, modeling critical thinking through play and face-to-face interaction, and cultivating spaces for discussion free of the fear of judgment. For science, the agenda includes expanding the sample and methods, assessing effects on different audiences, testing the effectiveness of countermeasures, and researching early warning signs of manipulative narratives.

Conclusions

Analysis of 500 posts on Facebook, X and TikTok demonstrates that emotional appeal, half-truths and visual misrepresentation are the most common manipulative tactics in coverage of the Russia-Ukraine war. These findings align with the ‘post-truth’ narrative within this conflict (Lewandowsky et al., 2017), though the limited sample size is acknowledged.

A strong asymmetry exists between pro-Russian and pro-Ukrainian sources: the former employ 2.5 times more manipulative techniques per post than the latter, particularly dehumanisation, conspiracy theories and visual deception. This indicates a systemic information operation with deep historical roots (Geissler et al., 2023; Horbyk et al., 2023), though causal claims require future longitudinal research.

TikTok is of particular concern, where 88% of pro-Russian posts contained visual deception. This confirms the transformation of propaganda in the short-video era (Jakubowski & Zinichenko, 2024), at least within this sample.

A notable inverse relationship exists between source credibility and manipulation: low-credibility sources engaged in four times more manipulation than high-credibility ones. This suggests a high cognitive burden for the average reader, though experimental research is needed to assess effects on perception (Ecker et al., 2022).

The stability of trends throughout 2024 indicates that the high concentration of manipulative techniques was systemic rather than event-driven. This suggests that digital media serve as a persistent battlefield in hybrid conflicts, at least in the Russia-Ukraine case.

These findings suggest a multi-tiered response: at the personal level, emotional self-regulation and conscious consumption; at the institutional level, media literacy education and the support of independent media; and at the global level, collaborative efforts to dismantle coordinated disinformation campaigns (European External Action Service, 2025). Further studies with a larger sample size and experimental design are needed to generalise these results beyond the present conflict.

REFERENCES

BARNARD, S.R. (2024) Hacking hybrid media: Power and practice in an age of manipulation. Oxford: Oxford University Press. Available at: https://doi.org/10.1093/oso/9780197570272.001.0001

BENNETT, W.L. AND LIVINGSTON, S. (eds.) (2020) The disinformation age: Politics, technology, and disruptive communication in the United States. Cambridge: Cambridge University Press. Available at: https://doi.org/10.1017/9781108914628

BRADSHAW, S. AND HOWARD, P.N. (2018) ‘The global organization of social media disinformation campaigns’, Journal of International Affairs, 71(1.5), pp. 23–32. Available at: https://www.jstor.org/stable/26508115

ECKER, U.K.H., LEWANDOWSKY, S., COOK, J., VAN DER LINDEN, S., ROOZENBEEK, J. AND ORESKES, N. (2022) ‘The psychological drivers of misinformation belief and its resistance to correction’, Nature Reviews Psychology, 1(1), pp. 13–29. Available at: https://doi.org/10.1038/s44159-021-00006-y

EUROPEAN EXTERNAL ACTION SERVICE (2025) 3rd EEAS report on foreign information manipulation and interference threats: Exposing the architecture of FIMI operations, March. Available at: https://www.eeas.europa.eu/sites/default/files/documents/2025/EEAS-3nd-ThreatReport-March-2025-05-Digital-HD.pdf

GEISSLER, D., BÄR, D., PRÖLLOCHS, N. AND FEUERRIEGEL, S. (2023) ‘Russian propaganda on social media during the 2022 invasion of Ukraine’, EPJ Data Science, 12(1), Article 35. Available at: https://doi.org/10.1140/epjds/s13688-023-00414-5

HORBYK, R., PRYMACHENKO, Y. AND ORLOVA, D. (2023) ‘The transformation of propaganda: The continuities and discontinuities of information operations within Soviet/Russian active measures’, Nordic Journal of Media Studies, 5(1), pp. 68–94. Available at: https://doi.org/10.2478/njms-2023-0005

JAKUBOWSKI, J. AND ZINICHENKO, V. (2024) ‘From classic to TikTok propaganda: Russian aggression in Ukraine and new media perspective’, Polityka i Społeczeństwo, 3(22), pp. 98–111. Available at: https://repozytorium.ur.edu.pl/handle/item/11330

LEWANDOWSKY, S., ECKER, U.K.H. AND COOK, J. (2017) ‘Beyond misinformation: Understanding and coping with the “post-truth” era’, Journal of Applied Research in Memory and Cognition, 6(4), pp. 353–369. Available at: https://doi.org/10.1016/j.jarmac.2017.07.008

MARWICK, A.E. AND LEWIS, R. (2017) Media manipulation and disinformation online. New York: Data & Society Research Institute. Available at: https://datasociety.net/library/media-manipulation-and-disinfo-online/

MUSTAȚĂ, M.-A. ET AL. (2023) ‘Assessing the truthfulness of security and defence news in Central and Eastern Europe: The role of cognitive style and the promise of epistemic sophistication’, Applied Cognitive Psychology, 37(6), pp. 1384–1396. Available at: https://doi.org/10.1002/acp.4130

PASITSELSKA, O. (2022) ‘Better ask your neighbor: Renegotiating media trust during the Russian–Ukrainian conflict’, Human Communication Research, 48(2), pp. 179–202. Available at: https://doi.org/10.1093/hcr/hqac003

POCHEPTSOV, H.H. (2016) Meanings and wars: Ukraine and Russia in information and meaning wars. Kyiv: Vydavnychyi dim “Kyievo-Mohylianska akademiia”. Available at: http://www.eebooks.de/Smysly-i-viiny-Ukraina-i-Rosiia-v-informatsiinii-i-smyslovii-viinakh

POLYAKOVA, A. AND BOYER, S.P. (2018) The future of political warfare: Russia, the West, and the coming age of global digital competition, March. Washington, DC: The Brookings Institution. Available at: https://www.brookings.edu/wp-content/uploads/2018/03/the-future-of-political-warfare.pdf

PROHONIUK, L. (2025) ‘The role of media communications in ensuring the information security of the state in a hybrid warfare: Challenges, threats and ways to increase the sustainability of society’, Society. Technology. Solutions. Proceedings of the International Scientific Conference, 3, pp. 44–45. Available at: https://doi.org/10.35363/ViA.sts.2025.129

ROMAŃSKI, R. (2025) ‘Mechanisms of disinformation amplification in hybrid warfare: The case of the conflict in Ukraine’, Bulletin of «Carol I» National Defence University, 14(2), pp. 123–138. Available at: https://doi.org/10.53477/2284-9378-25-13

TUMBER, H. AND WAISBORD, S. (eds.) (2021) The Routledge companion to media disinformation and populism. London: Routledge. Available at: https://doi.org/10.4324/9781003004431

WARDLE, C. AND DERAKHSHAN, H. (2017) Information disorder: Toward an interdisciplinary framework for research and policymaking. Strasbourg: Council of Europe. Available at: https://rm.coe.int/information-disorder-toward-an-interdisciplinary-framework-for-researc/168076277c

WOOLLEY, S.C. AND HOWARD, P.N. (eds.) (2019) Computational propaganda: Political parties, politicians, and political manipulation on social media. Oxford: Oxford University Press. Available at: https://doi.org/10.1093/oso/9780190931407.001.0001

ZUBOFF, S. (2019) The age of surveillance capitalism: The fight for a human future at the new frontier of power. New York: PublicAffairs. Available at: https://www.hbs.edu/faculty/Pages/item.aspx?num=56791

ZHUYKOVA, M. (2024) ‘Potochne znannya vs. naukove znannya v definitsiyakh zahalʹnykh tlumachnykh slovnykiv [Commonsense knowledge vs. scientific knowledge in the definitions of general dictionaries]’, Synopsys: Text, Context, Media, 30(1), pp. 34–42. Available at: https://doi.org/10.28925/2311-259x.2024.1.5

Appendix A

DATA REGISTRATION FORM

Research topic: Digital media as a tool for manipulative influence on public consciousness in hybrid conflicts

Coder: _________________________

Date of coding: _________________________

Section 1. Identification data

| No. | Field name | Description / Choices | Code |
|---|---|---|---|
| 1 | Post ID | Unique identifier (e.g. UA001-RU001) | |
| 2 | Post title/text | Summary or first words | |
| 3 | Post URL | Direct link | |
| 4 | Publication date | DD.MM.YYYY | |
| 5 | Publication time | HH:MM | |
| 6 | Author/source | Name of the public page, blogger or media outlet | |
| 7 | Account type | 1 = official / 2 = personal blogger / 3 = anonymous / 4 = media / 5 = other | |
| 8 | Number of subscribers | Approximate count (if available) | |
| 9 | Number of interactions | Likes + reposts + comments (total) | |
| 10 | Post language | 1 = Ukrainian / 2 = Russian / 3 = English / 4 = other | |

Section 2. Selection parameters

| No. | Field name | Description / Choices | Code |
|---|---|---|---|
| 11 | Platform | 1 = Facebook / 2 = X (Twitter) / 3 = TikTok | |
| 12 | Hashtags | List of hashtags in the post | |
| 13 | Relevance to the topic | 1 = yes / 0 = no (if no, the post is excluded) | |
| 14 | Source group | 1 = pro-Kremlin/pro-Russian / 2 = pro-Ukrainian/pro-Western | |

Section 3. Manipulative techniques (codifier)

Instructions: Mark 1 (yes) if the technique is present in the post, or 0 (no) if it is not.

| No. | Manipulative technique | Presence criteria | Yes (1) | No (0) |
|---|---|---|---|---|
| 15 | Emotional coloring | Words, images or emojis that evoke strong emotions (anger, sympathy, indignation, joy); the text aims at emotional engagement rather than informing. | | |
| 16 | Half-truth / facts taken out of context | Real events or facts are mentioned but stripped of the necessary context, which changes their meaning; manipulation of numbers, dates, places. | | |
| 17 | Visual deception | The photo or video does not match the description (old footage, other locations, stock images, AI-generated); no link to the original source; the caption contradicts the image. | | |
| 18 | Conspiracy theories | References to "secret governments", "global conspiracy", "bio-laboratories", "secret administration" without evidence; events explained through the actions of hidden forces. | | |
| 19 | Appeal to fear | Threatening images, predictions, intimidation with consequences (hunger, death, loss of control); creating an atmosphere of imminent danger. | | |
| 20 | Appeal to authority | References to "experts", "eyewitnesses", "well-known sources" without specific names or verification; mention of famous individuals irrelevant to the topic. | | |
| 21 | Dehumanization of the enemy | Depiction of the opposing side as "non-humans", "creatures", "orcs", "fascists", "subhumans"; denial of the right to compassion. | | |
| 22 | Signs of bot farms / fakes | Signs of inauthentic activity: many similar comments, accounts with no history, strange names, suspiciously fast spread. | | |

Section 4. Additional features

| No. | Field name | Description / Choices | Code |
|---|---|---|---|
| 23 | Number of techniques in one post | Sum of "yes" marks from Section 3 (0 to 8) | |
| 24 | Main narrative | 1 = war crimes / 2 = aid/volunteering / 3 = political decisions / 4 = history/culture / 5 = economy/sanctions / 6 = other | |
| 25 | Post tone | 1 = aggressive / 2 = accusatory / 3 = neutral / 4 = sympathetic / 5 = patriotic/uplifting | |
| 26 | Availability of links | 1 = link to the source present / 0 = no links | |
| 27 | Source reliability (subjective assessment) | 1 = high / 2 = medium / 3 = low (based on fact-checking) | |
| 28 | Coder comments | Additional observations, important details, quotes | |