In this article, we frame the research using newsworthiness theory and the analysis of news media coverage of natural disasters. Newsworthiness theory, developed by Galtung and Ruge (1965), suggests that events are judged worth reporting according to the psychology of individual perception, such that the weight of one factor can be offset by another. In further detail, Kwak and An (2014) argue that these factors are the frequency, intensity, unambiguity, meaningfulness, consonance, unexpectedness, and continuity of an event. These factors guide media companies' decisions about which news items to report through channels such as newspapers and TV news.
However, in a more complex analysis, Wu (1998) argues that newsworthiness alone is not adequate to explain the determinants of international news flow. Two factors shape how international news is channelled. The first is the gatekeepers' perspective, which includes newsworthiness, socio-cultural structure, organizational constraints, and the agenda-setting of international news services; the attention and interest international media devote to a foreign country may be steered by the news media agenda and the deviance of the event. The second comprises logistical factors: the clout of a country determines media interest in coverage, including trade relationships, cultural background, geographical proximity, and regionalism.
News flow and coverage of natural disasters are expected not only to communicate the event and its physical and socio-economic impacts, but also to warn the public and contribute to individual and community preparedness, recovery, and resilience (Houston, Pfefferbaum and Rosenholtz, 2012). That paper further suggests that, on average, disaster news is covered for shorter periods than other issues, that coverage tends to focus on the current impact of disasters on humans and the built environment, and that economics is an important topic. On the other hand, the study highlights that there is little coverage of what caused the disaster, what influenced responses to it, or what the disaster means for the people and communities experiencing it. This may weaken the disaster narrative and hinder people from learning, through the news coverage, how to prepare themselves and from understanding the national or societal implications for those who experience the disaster.
Several issues arise between media and disaster reporting. The first is the demand for fast and accurate information updates. The study by Binderkrantz (2020) shows that traditional news outlets, such as newspapers, have been regarded as slow in updating the public on disasters, since by the time the news is disseminated the information is already outdated. Furthermore, as media are strongly tied to interest groups, their framed messages consequently highlight some features of reality and downplay others; in this sense, a news frame can be a weapon to generate support for a specific action or policy. Second, news coverage of natural disasters can define and limit the discourses related to these events. As natural disasters cause destruction and inhibit daily services, media coverage foregrounds the need for recovery of capital, society, and the built environment. For instance, Miles and Morse (2006) argue that the way media coverage prioritizes recovery of various forms of capital (natural, human, social, and built) shapes the public's perception of natural hazards and influences individual strategies for mitigating future vulnerabilities.
At present, the emergence of Industry 4.0 technology has allowed information technology to be distributed and accessed worldwide. Following this technological advancement, news coverage and its analytical methods have expanded over the past decade. The abundance of available data and new analytical methods allow one to analyse data in much more efficient and novel ways. This may be represented by datafication, which refers to the use of digital technologies to release the knowledge associated with physical objects by decoupling it from the data associated with them. Explaining further, Mayer-Schonberger and Cukier (2013) argue that datafication enables indirect uses of text and location data that have nothing to do with the purpose for which the information was initially generated: new uses emerge and new value can be created. For instance, when news texts become online copies they are "digitized", whereas when these texts are indexed and made searchable they are datafied. A similar shift is found in the transportation sector: once route-finding and tracking became digitized, revenues from loyalty programs, restaurant recommendations, and advertising became instances of datafication.
This shift is also found in the journalism sector, as broadcast, print, and web news available offline and online can be gathered and analysed. One source of such big data is the GDELT database, a platform that hosts millions of news articles worldwide dating back to the 1980s. Comparisons of GDELT with similar sources show that it is a powerful database. For instance, Kwak and An (2016) find that news obtained from GDELT and from Event Registry (ER) is similar, even though the two datasets differ considerably in scale and in news sources: GDELT collects 2.26 to 6.43 times more documents per day than ER, and it also publishes documents in more languages per day on average (64.1 versus 14, respectively).
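As an illustration of how such news data can be gathered, GDELT exposes a public full-text search endpoint (the GDELT 2.0 DOC API). The sketch below builds a query URL for disaster-related coverage; the endpoint and parameter names follow GDELT's public documentation, but the search terms and filter values are illustrative assumptions, not part of the studies cited here.

```python
from urllib.parse import urlencode

# Base endpoint of the GDELT 2.0 DOC full-text search API
GDELT_DOC_API = "https://api.gdeltproject.org/api/v2/doc/doc"

def build_gdelt_query(keywords, timespan="7d", max_records=50):
    """Build a GDELT DOC API URL for an article-list query.

    keywords    -- search terms, e.g. 'earthquake "disaster relief"'
    timespan    -- how far back to search (e.g. '7d' for seven days)
    max_records -- number of articles to return
    """
    params = {
        "query": keywords,
        "mode": "ArtList",      # return a list of matching articles
        "format": "json",
        "timespan": timespan,
        "maxrecords": max_records,
    }
    return f"{GDELT_DOC_API}?{urlencode(params)}"

# Fetching this URL (e.g. with urllib.request) returns JSON whose
# "articles" list carries each article's url, title, language,
# source domain, and source country.
url = build_gdelt_query('earthquake "disaster relief"', timespan="3d")
```

In practice, the returned article lists can then be filtered by language or source country before any newsworthiness analysis of the kind described above.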
Many studies across various disciplines have deployed GDELT data. The study by Aritenang et al (2018) shows that news about a particular country published in another country is important in shaping perspectives and appropriate policy towards the former country. Furthermore, Kwak and An (2014) suggest that the unexpectedness of an event that continues for days reinforces its newsworthiness over time. Kwak and An (2016) also show that the database is dominated by English-language news (more than 2 million articles), followed by Spanish (646,000) and Arabic (320,000); the paper further shows that the Daily Mail (UK) and Reuters (USA) are the largest sources of articles.
In disaster studies, Arva et al (2013) suggest that one has to analyze the GDELT database carefully. Kwak and An (2014) highlight that, because US media are the most heavily tracked in GDELT, the number of news items on a given disaster may be skewed by US media coverage; one way to avoid this bias is to include other global news outlets that report on the disaster. Another line of work predicts the value of news within the next few hours. Nourbakhsh et al (2014) argue that computational journalism using GDELT data has improved how journalism operates. Their study uses NLP and machine learning techniques that allow a model to learn from and predict disaster news that will have high value, drawing on a feed from local authorities and news media. For instance, comparing the disaster feed with Reuters News, the paper found that, of 18 disasters within the period of analysis, there were 8 events where the disaster feed was ahead of the news office.
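The US-media bias noted above can be partially quantified by splitting article counts by source country. A minimal sketch, assuming each article record carries a `sourcecountry` field as in GDELT's DOC API article lists (the field name and the sample records are assumptions for illustration, not data from the cited studies):

```python
from collections import Counter

def coverage_by_origin(articles, domestic_code="US"):
    """Split article counts into domestic vs international coverage.

    articles -- iterable of dicts with a 'sourcecountry' field, as in
                GDELT DOC API article lists.
    Returns (domestic_count, international_count).
    """
    counts = Counter(
        "domestic" if a.get("sourcecountry") == domestic_code else "international"
        for a in articles
    )
    return counts["domestic"], counts["international"]

# Hypothetical records: a disaster covered mostly by US outlets would
# show a high domestic share, signalling the sampling bias above.
sample = [
    {"sourcecountry": "US"},
    {"sourcecountry": "US"},
    {"sourcecountry": "ID"},
    {"sourcecountry": "JP"},
]
domestic, international = coverage_by_origin(sample)
share_international = international / (domestic + international)
```

A low international share for a given disaster would flag exactly the coverage skew that Kwak and An (2014) caution against, suggesting the sample be broadened to other global outlets before analysis.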