Initial Use of Big Data and Open Data in Disaster Risk Management

As more sensor networks, Internet of Things (IoT) devices and smart devices are applied to data collection, covering both physical parameters and social responses, big data increasingly shapes the quality of disaster risk reduction, emergency preparedness and emergency operations. Appropriate use of big data depends on quality, transmission, exchange, storage, display and dissemination throughout the whole data life cycle; "data life cycle" here refers to the approaches needed to turn big data into usable knowledge. Moreover, solving disaster risk management problems requires integrating diverse data sets according to the characteristics of each request. For example, for an evacuation executed before a typhoon makes landfall, demographic structure, weather forecasts, real-time rainfall monitoring, threshold values that trigger flood or landslide warnings, and supporting resources are the essential elements of a successful operation. Much of this digital preparedness, however, demands comprehensive coordination and arrangements in advance. To make the most of data for enhancing public awareness and decision formation, producing open data is equally important: it reaches targeted user groups who can understand the key information intelligence and take the right actions. Once a policy of open data is embraced, enablers are needed to establish a well-regulated environment among original data producers, data aggregators, app or system developers, transmission channel providers and final message receivers. The following sections describe recent developments in big data and open data for disaster risk management.


Big Data and Open Data Fully Support Disaster Risk Management
Looking at the digital preparation of big data for all phases of disaster risk management, it clearly demands both vertical and horizontal coordination among all stakeholders, which usually takes time to build consensus on an enabling environment that maintains a smooth data flow.
Outputs offered by the scientific community could include numerical simulations and data analysis with historical, statistical and spatial factors. That is one source of big data. The results can help at the planning stage to understand potential risks, both physical and social; at emergency response time to highlight possibly affected areas (disaster hot spots); and during post-disaster recovery to prevent emerging risks and implement "Build back better." In recent decades, inter-disciplinary endeavors by the scientific community have gradually explored the characteristics of natural hazards and all aspects of vulnerability. Through a systemic approach, this assists in unfolding the likely impacts on infrastructure, livelihoods, the economy and sustainable development. Nevertheless, any scientific model has its limits and assumptions. There is no one-size-fits-all solution to the challenges resulting from natural hazards, physical environments or social conditions, because no two disasters or affected areas are alike. When taking scientific outputs into disaster risk management, disaster managers should therefore be well aware that misleading results or errors can happen at any time.
To enhance the performance of scientific results, inputs from monitoring networks offer instant calibration of numerical models and close-up views of developing situations in the field. Depending on the measurements involved (rainfall, earthquakes, water levels in river basins or reservoirs, soil moisture and other parameters) the overwhelming volume of inputs might hamper decision making because there is "too much information to digest" [1]. This happens when there is no well-organized mechanism for allocating resources to process the continuously incoming big data. Nonetheless, real-time or near-real-time inputs do fill gaps that numerical methods fail to cover. Even closed-circuit television (CCTV) is now showing great potential for collecting instant video by remote control. Dynamic image recognition is another effective approach to detecting immediate risks such as floods, landslides, storm surges and rising water levels, helping commanding officers focus on disaster hotspots. Figure 1 is a systematic diagram explaining how to integrate the relevant ingredients for successful disaster risk management.
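The hotspot-flagging step described above can be sketched as a simple threshold check over incoming sensor readings. This is a minimal illustration only: the parameter names, station IDs and threshold values are assumptions, not values from any real monitoring network.

```python
# Sketch: flag disaster hotspots by checking real-time readings against
# pre-set hazard thresholds (illustrative values, not a real network).
THRESHOLDS = {"rainfall_mm_per_hr": 80.0, "river_level_m": 4.5}

def flag_hotspots(readings):
    """Return (station, exceeded-parameters) pairs for stations over any threshold."""
    hotspots = []
    for station, values in readings.items():
        exceeded = [p for p, limit in THRESHOLDS.items()
                    if values.get(p, 0.0) >= limit]
        if exceeded:
            hotspots.append((station, exceeded))
    return hotspots

readings = {
    "station_A": {"rainfall_mm_per_hr": 95.0, "river_level_m": 3.2},
    "station_B": {"rainfall_mm_per_hr": 40.0, "river_level_m": 4.8},
    "station_C": {"rainfall_mm_per_hr": 20.0, "river_level_m": 1.1},
}
print(flag_hotspots(readings))
```

In practice such thresholds would be hazard- and basin-specific, and the check would run continuously against streaming inputs rather than a static dictionary.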
Beyond the support of science and technology, the awareness of decision makers is an additional factor in carrying out all necessary actions and reactions to fulfill evidence-based management. Thus, all information should be tidy and easily updatable, and the content design should meet the demands of emergency preparedness and emergency response [2].

Operational Model for Big Data and Open Data
In today's information age, ever more big data sets, both structured and unstructured, have created new dimensions for measuring or detecting dynamic changes in physical and social risks. Considering the ways big data is collected, there are clearly numerous channels for gathering the necessary information. To avoid being overwhelmed by too many incoming messages, a mechanism should be designed in advance to set standards for data formats, communication protocols, the exchange platform and storage policy [3,4].
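One way to realize such a pre-agreed standard is to validate every incoming record against a mandatory field set before it enters the shared exchange platform. The field names below are hypothetical examples of such a convention, not a published schema.

```python
# Sketch: enforce a pre-agreed record format before ingestion into the
# shared exchange platform (hypothetical field names).
REQUIRED_FIELDS = {"source_id", "timestamp", "parameter", "value", "unit"}

def validate(record):
    """Return (is_valid, sorted list of missing fields) for one incoming record."""
    missing = REQUIRED_FIELDS - record.keys()
    return (not missing, sorted(missing))

ok_record = {"source_id": "gauge-07", "timestamp": "2024-01-01T00:00:00Z",
             "parameter": "rainfall", "value": 12.5, "unit": "mm/hr"}
print(validate(ok_record))
print(validate({"source_id": "gauge-07"}))
```

Rejected records can then be returned to the producing agency with the list of missing fields, which keeps the burden of correction at the source rather than at the aggregator.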
The data-collection process requires an "N to One" policy. "N" means all the different data sets that could be applied to disaster risk management. The quality of big data strongly influences the quality of decisions. "Quality big data" is defined by four factors: accuracy, routine maintenance, easy accessibility and rapid readiness. Among the four, rapid readiness is the key to producing informative outputs, and it depends on many pre-set defaults that require consensus and collaboration among the individual big data producers: for example, data dissemination through a cloud storage architecture, efficient algorithms to encrypt and decrypt data, and a secure information exchange platform to accelerate data application [5,6]. "One" means a leading agency that coordinates the differing requests of all partners and monitors the "health" of all data sets. This "One" has to draw up a comprehensive plan on how to "cook" all the data into information intelligence that helps diverse end users (decision makers, citizens or emergency responders) take action. An appropriately assigned "One" also plays a neutral role in settling arguments and disagreements among key stakeholders. The process from data producers to designated users thus forms an end-to-end framework for collecting and delivering information. Figure 2 explains the conceptual ideas of "N-to-One and One-to-Many".
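The "N to One" idea can be sketched as a single coordinating aggregator that ingests records from N feeds and monitors each feed's "health" as the time since its last successful update. The feed names and the staleness limit below are illustrative assumptions.

```python
import time

# Sketch: a single "One" aggregator over N feeds, tracking each feed's
# "health" as time since its last update (illustrative staleness limit).
class Aggregator:
    def __init__(self, stale_after_s=600):
        self.stale_after_s = stale_after_s  # seconds before a feed counts as unhealthy
        self.last_seen = {}                 # feed name -> last update time
        self.records = []                   # normalized records from all feeds

    def ingest(self, feed, record, now=None):
        now = time.time() if now is None else now
        self.last_seen[feed] = now
        self.records.append({"feed": feed, **record})

    def unhealthy_feeds(self, now=None):
        now = time.time() if now is None else now
        return [f for f, t in self.last_seen.items()
                if now - t > self.stale_after_s]

agg = Aggregator(stale_after_s=600)
agg.ingest("rain_gauge", {"value_mm": 12.5}, now=1000.0)
agg.ingest("river_sensor", {"level_m": 3.1}, now=1200.0)
print(agg.unhealthy_feeds(now=1700.0))
```

A silent feed is often more dangerous than a noisy one during an emergency, which is why the health check is framed as "time since last update" rather than data volume.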
To meet end users' demands, it is essential to generate tailor-made information for specific user groups. Messages to the general public should be simple and actionable; notices to emergency responders must be clear enough to support taking action; information to commanding officials has to offer an integrated, comprehensive picture of the situation for decisions on allocating resources, equipment or personnel to mitigate possible impacts. Beyond information preparation, wide-coverage, multi-channel dissemination is the last mile for reaching the different pre-defined user groups.
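Tailoring one underlying event into audience-specific messages, as described above, might look like the following sketch. The wording templates, audience labels and event fields are invented for illustration.

```python
# Sketch: render one underlying event as tailor-made messages for three
# user groups (invented wording templates and field names).
def render(event, audience):
    if audience == "public":
        # Simple and actionable
        return f"{event['hazard'].title()} warning for {event['area']}: {event['action']}."
    if audience == "responder":
        # Clear and task-oriented
        return f"[{event['severity'].upper()}] {event['hazard']} at {event['area']}; task: {event['action']}."
    if audience == "commander":
        # Integrated picture for resource allocation
        return (f"{event['hazard']} / {event['area']} / severity {event['severity']}; "
                f"standby resources: {event['resources']}")
    raise ValueError(f"unknown audience: {audience}")

event = {"hazard": "flood", "area": "River District",
         "action": "move to higher ground", "severity": "severe",
         "resources": "2 boat teams, 1 medical unit"}
for audience in ("public", "responder", "commander"):
    print(render(event, audience))
```

Keeping the event data separate from the rendering templates means each audience's wording can be revised by communication specialists without touching the data pipeline.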
Possible channels for information dissemination include the Internet, Public Warning System, Cell Broadcast Service, TV and any other device able to display or announce information or disaster warnings. To enable a robust environment for informing end users, basic standards or protocols are required to formulate the rules to follow. This is the process that turns big data into open data. For example, introducing the Common Alerting Protocol (CAP) makes disaster alerts communicable to all devices connected to broadband Internet or Wi-Fi. Moreover, since CAP is a communication protocol, actionable commands or instructions can be embedded in it to activate mechanical equipment.
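As a sketch of how such an alert could be serialized for machine exchange, the snippet below emits a minimal CAP 1.2 message. The element set and namespace follow the OASIS CAP 1.2 standard, but the identifier, sender and event details are placeholder values, and a production alert would typically carry more elements (expiry, instructions, geocodes) than shown here.

```python
import xml.etree.ElementTree as ET

# Sketch: emit a minimal CAP 1.2 alert. The element names and namespace
# follow the OASIS CAP 1.2 standard; all values below are placeholders.
CAP_NS = "urn:oasis:names:tc:emergency:cap:1.2"

def build_cap_alert(identifier, sender, sent, event, severity, area_desc):
    ET.register_namespace("", CAP_NS)  # serialize without a namespace prefix
    alert = ET.Element(f"{{{CAP_NS}}}alert")
    for tag, text in [("identifier", identifier), ("sender", sender),
                      ("sent", sent), ("status", "Actual"),
                      ("msgType", "Alert"), ("scope", "Public")]:
        ET.SubElement(alert, f"{{{CAP_NS}}}{tag}").text = text
    info = ET.SubElement(alert, f"{{{CAP_NS}}}info")
    for tag, text in [("category", "Met"), ("event", event),
                      ("urgency", "Immediate"), ("severity", severity),
                      ("certainty", "Observed")]:
        ET.SubElement(info, f"{{{CAP_NS}}}{tag}").text = text
    area = ET.SubElement(info, f"{{{CAP_NS}}}area")
    ET.SubElement(area, f"{{{CAP_NS}}}areaDesc").text = area_desc
    return ET.tostring(alert, encoding="unicode")

xml = build_cap_alert("EX-2024-001", "alerts@example.org",
                      "2024-01-01T00:00:00+00:00", "Flood warning",
                      "Severe", "River District")
print(xml)
```

Because the payload is plain XML in a published namespace, any downstream channel (cell broadcast gateway, TV crawler, siren controller) can parse the same message and act on the fields relevant to it.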

Key Considerations in Using Big Data and Open Data for Disaster Risk Management
The discussions above have addressed key factors in applying big data and open data to disaster risk management; several other considerations should also be addressed. Figure 3 depicts the key considerations on big data and open data. For better information intelligence to communicate with end users, data presentation is essential to delivering information efficiently and effectively. In particular, GIS is broadly applied to disaster risk reduction and emergency operation. An illustrative GIS map with spatial and temporal factors offers decision support to commanding officials. Risk communication with citizens also requires simple, understandable graphics to assist risk perception; pure text reports are not suitable for rapidly creating an understanding of risk [7][8][9].
Data exchange and sharing are always critical issues in achieving cross-cutting inter-agency collaboration. Well-defined protocols can guide the circulation and application of information and also offer an inventory of all data sets ready for use. Internet security is another focal issue: since some big data sets can be intelligence-sensitive and must be well protected, defending against information leakage or operational interruption by anonymous hackers requires strong countermeasures.
To conduct impact-assessment-based preparedness, integrating physical and social vulnerabilities is a growing trend for exploring possible damage and casualties under scenarios or real situations. For the decision-making process, knowing the social impacts helps allocate resources in advance. Why is impact assessment important? Because of the limits of science, uncertainty and deviation always exist in numerical results. Examining social impacts directly helps prioritize the possibly affected areas and supports risk communication with the general public.
Smart devices not only bring daily convenience but also form a new channel for collecting situational information during a disaster event. Pictures, texts and videos are effective ways for citizens to report what is happening, and because most information transmitted by smart devices carries GPS tags, it provides vast inputs for building a better-understood situation map. On the other hand, attention is needed to filter out "noise" from these inputs, including duplications, rumors and fake news. The scientific community is now developing methodologies to distinguish facts from noise with the assistance of artificial intelligence. Until a mature system for screening out misleading messages is delivered, special care is needed when processing massive inputs from the Internet.
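A first-pass "noise" filter of the kind described above might drop exact duplicate texts and group the remaining reports by rounded GPS coordinates, so one incident is not counted many times. The report fields and rounding precision are illustrative assumptions; this is a sketch, not a full rumor or fake-news detector.

```python
# Sketch: screen crowd-sourced reports by dropping exact duplicate texts
# and grouping the rest by rounded GPS coordinates (illustrative fields
# and precision; not a full rumor/fake-news filter).
def dedupe_reports(reports, precision=3):
    seen_texts = set()
    clusters = {}  # rounded (lat, lon) cell -> reports in that cell
    for report in reports:
        text_key = report["text"].strip().lower()
        if text_key in seen_texts:
            continue  # exact duplicate message, likely a re-share
        seen_texts.add(text_key)
        cell = (round(report["lat"], precision), round(report["lon"], precision))
        clusters.setdefault(cell, []).append(report)
    return clusters

reports = [
    {"text": "Road flooded near bridge", "lat": 25.0331, "lon": 121.5654},
    {"text": "road flooded near bridge", "lat": 25.0332, "lon": 121.5655},
    {"text": "Landslide on hill road", "lat": 24.1469, "lon": 120.6839},
]
clusters = dedupe_reports(reports)
print(len(clusters))
```

Rounding to three decimal places groups reports within roughly one hundred meters of each other; the right precision depends on the hazard and the density of reporting.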

Conclusions
Big data and open data offer new approaches to understanding the multifaceted characteristics of natural disasters. Beyond traditional measurements of environmental change, inputs from real-time and social-dynamic data expand the scope within which emergency responders, researchers and decision makers can explore disaster impacts. To make better use of these two different categories of data, special attention is needed to produce useful information. The individual user groups (researchers, decision makers, disaster managers, emergency responders and citizens) should maintain constant dialogue to prepare better through multilateral understanding.