Global Problems of the Environment and Nature Management - Environmental Analytical Monitoring of the Environment.

Monitoring is carried out by regional committees for hydrometeorological services through a network of special stations that conduct ground-based meteorological, hydrological, marine observations, etc. At present, UNEP operates 344 water monitoring stations worldwide, located in 59 countries.

Environmental monitoring in Moscow includes constant monitoring of the content of carbon monoxide, hydrocarbons, sulfur dioxide, nitrogen oxides, ozone and dust; observations of the city's atmosphere are carried out at 30 stationary installations operating in automatic mode. From the information processing center, data on exceedances of the maximum permissible concentration (MPC) are sent to the Moscow Committee for Environmental Protection and, at the same time, to the government of the capital. Industrial emissions from large enterprises are also controlled automatically, as is the level of water pollution in the Moskva River. Based on the monitoring data, the main sources of pollution are determined. Fig. 2 shows a block diagram of the monitoring classification.

Fig. 2. Block diagram of the monitoring classification
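To make the alerting step described above concrete, here is a minimal Python sketch of how an automatic station's readings could be checked against maximum permissible concentrations before the data is forwarded. The pollutant list follows the text; the numeric limits, station name and readings are illustrative assumptions, not the actual Moscow regulatory values.

```python
# Sketch: flagging exceedances of maximum permissible concentrations (MPC).
# The limits and readings below are illustrative placeholders only.

MPC_MG_M3 = {       # hypothetical single-measurement limits, mg/m3
    "CO": 5.0,
    "SO2": 0.5,
    "NO2": 0.2,
    "O3": 0.16,
    "dust": 0.5,
}

def check_station(station_id, readings):
    """Return (pollutant, value, limit) triples that exceed the MPC."""
    return [
        (name, value, MPC_MG_M3[name])
        for name, value in readings.items()
        if name in MPC_MG_M3 and value > MPC_MG_M3[name]
    ]

# Latest readings from one automatic post (illustrative numbers).
for name, value, limit in check_station("post-17", {"CO": 6.2, "NO2": 0.25}):
    print(f"post-17: {name} = {value} mg/m3 exceeds MPC {limit} mg/m3")
```

In a real network, such checks would run in the information processing center, which then notifies the environmental committee and the city government, as described above.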

For example, the purpose of biological monitoring is to determine the resistance of natural ecosystems to external influences. Its main method is bioindication (biotesting): the detection and assessment of anthropogenic loads by the reactions of living organisms and their communities to them. Thus, radioactive pollution can be judged by the state of coniferous trees, industrial pollution by the behavior of many representatives of the soil fauna, and air pollution is very sensitively registered by mosses and lichens. If lichens disappear from tree trunks in a forest, sulfur dioxide is present in the air. The color of lichens (a method called lichen indication) also indicates the presence of certain heavy metals in the soil, for example copper. Bioindication makes it possible to detect pollution before it reaches dangerous levels and to take timely measures to restore the ecological balance of the environment.

According to the scale of information generalization, the following types of monitoring are distinguished:

global - tracking worldwide processes and phenomena in the biosphere using space and aviation technology and computers, and forecasting possible changes on Earth. A special case is national monitoring, which covers similar activities carried out on the territory of a particular country;

regional - covers individual regions;

impact - carried out in especially hazardous areas directly adjacent to sources of pollution, for example, in the area of an industrial enterprise.

Ecological and analytical monitoring of the environment.

Ecological and analytical monitoring - monitoring of the content of pollutants in water, air and soil using physical, chemical and physicochemical methods of analysis - makes it possible to detect the entry of pollutants into the environment, to establish the influence of anthropogenic factors against the background of natural ones, and to optimize the interaction of man with nature. Soil monitoring, for example, provides for the determination of soil acidity, salinity and humus loss.

Chemical monitoring, a part of ecological and analytical monitoring, is a system of observations of the chemical composition of the atmosphere, precipitation, surface water and groundwater, the waters of oceans and seas, soils, bottom sediments, vegetation and animals, combined with control over the dynamics of the spread of chemical pollutants. Its task is to determine the actual level of environmental pollution with highly toxic ingredients; its purposes are the scientific and technical support of the system of observations and forecasts; the identification of sources and factors of pollution and the degree of their impact; monitoring of the established sources of pollutants entering the natural environment and of the resulting pollution levels; assessment of actual environmental pollution; and forecasting of environmental pollution and of ways to improve the situation.

Such a system is based on sectoral and regional data and includes elements of those subsystems; it can cover both local areas within one state (national monitoring) and the Earth as a whole (global monitoring).

Ecological and analytical monitoring of pollution as part of the Unified State System of Environmental Monitoring. In order to radically increase the efficiency of work to preserve and improve the state of the environment and to ensure environmental safety, Decree of the Government of the Russian Federation No. 1229 "On the Creation of the Unified State System of Environmental Monitoring" (EGSEM) was adopted on November 24, 1993. The organization of work on creating the EGSEM provides for bringing new classes and types of pollutants into the scope of observations and identifying their impact on the environment, as well as expanding the geography of environmental monitoring to new territories and sources of pollution.

The main tasks of the EGSEM:

– development of programs for monitoring the state of the natural environment on the territory of Russia, in its individual regions and districts;

- organization of observations and measurements of indicators of environmental monitoring objects;

– ensuring the reliability and comparability of observational data across individual regions and districts and throughout Russia;

– collection and processing of observational data;

– storage of observation data, creation of special data banks characterizing the ecological situation on the territory of Russia and in its individual regions;

– harmonization of environmental data banks and databases with international environmental information systems;

– assessment and forecast of the state of environmental objects and anthropogenic impacts on them, natural resources, responses of ecosystems and public health to changes in the state of the human environment;

– carrying out operational control and precision measurements of radioactive and chemical contamination resulting from accidents and catastrophes, as well as forecasting the environmental situation and assessing the damage caused to the natural environment;

– availability of integrated environmental information to a wide range of consumers, social movements and organizations;

– informing the authorities about the state of the environment and natural resources, environmental safety;

– development and implementation of a unified scientific and technical policy in the field of environmental monitoring.

The EGSEM provides for the creation of two interconnected blocks: monitoring of the pollution of ecosystems and monitoring of the environmental consequences of such pollution. In addition, it should provide information on the initial (baseline) state of the biosphere and identify anthropogenic changes against the background of natural variability.

At present, observations of the levels of pollution of the atmosphere, soil, water and bottom sediments of rivers, lakes, reservoirs and seas in terms of physical, chemical and hydrobiological (for water bodies) indicators are carried out by Roshydromet services. Monitoring of sources of anthropogenic impact on the natural environment and the zones of their direct impact on the animal and plant world, terrestrial fauna and flora (except for forests) is carried out by the relevant services of the Ministry of Natural Resources. Monitoring of lands, geological environment and groundwater is carried out by subdivisions of the Committee of the Russian Federation on Land Resources and Land Management and the Committee of the Russian Federation on Geology and Subsoil Use.

In 2000, the Roshydromet system operated 150 chemical laboratories and 41 cluster laboratories analyzing air samples from 89 cities that have no laboratory control of their own. Observations of atmospheric pollution were carried out at 682 stationary posts in 248 cities and towns of the Russian Federation, and soils on agricultural land were also monitored.

Surface waters are monitored at 1,175 watercourses and 151 reservoirs. Sampling is carried out at 1,892 points (2,604 sites). In 2000, 30,000 water samples were analyzed for 113 indicators. Observation points for pollution of the marine environment exist on 11 seas washing the territory of the Russian Federation. In the Roshydromet system, more than 3,000 samples are analyzed annually for 12 indicators.

The network of monitoring stations for transboundary transport of pollutants is focused on the western border of Russia. At present, the Pushkinskie Gory and Pinega stations operate here, which carry out sampling of atmospheric aerosols, gases and precipitation.

Control of the chemical composition and acidity of atmospheric precipitation is carried out at 147 federal and regional stations. In most samples, only the pH value is measured on-line. When monitoring snow cover pollution, ammonium ions, sulfate ions, benzo(a)pyrene and heavy metals are also determined in the samples.

The system of global atmospheric background monitoring includes three types of stations: basic, regional and regional with an extended program.

Six integrated background monitoring stations have also been set up, located in biosphere reserves: Barguzinsky, Central Forest, Voronezh, Prioksko-Terrasny, Astrakhan and Caucasian.

For radiation monitoring on the territory of the country, especially in areas contaminated as a result of the Chernobyl accident and other radiation disasters, a fixed network and mobile devices are used. According to a special program, an aerial gamma survey of the territory of the Russian Federation is also carried out.

Within the framework of the EGSEM, a system is being created for the rapid detection of pollution associated with emergency situations.

Ecological and analytical monitoring of pollution within the EGSEM can be divided into three major blocks: pollution control in areas of significant anthropogenic impact, at the regional level, and at the background level.

Data from zones with any level of impact, both emergency and routine, are sent at specified intervals to the information collection and processing center. For the automated system currently being developed, the primary stage is a local system serving an individual district or city.

Information from mobile stations and stationary laboratories on environmental pollution with dioxins and related compounds is processed, sorted and transmitted to the next level - to regional information centers. Further, the data is sent to interested organizations. The third level of the system is the main data center, which summarizes information on environmental pollution on a national scale.

The efficiency of automated systems for processing environmental and analytical information grows noticeably when automatic stations for monitoring water and air pollution are used. Local automated air pollution control systems have been created in Moscow, St. Petersburg, Chelyabinsk, Nizhny Novgorod, Sterlitamak, Ufa and other cities. Experimental tests of stations for automated control of water quality at discharge and intake points are being carried out. Instruments have been created for the continuous determination of oxides of nitrogen, sulfur and carbon, ozone, ammonia, chlorine and volatile hydrocarbons. At automated water pollution control stations, temperature, pH, electrical conductivity, oxygen content, chloride ions, fluorine, copper, nitrates, etc. are measured.

The information-analytical system CPS "Monitoring-Analysis" makes it possible to control the customs clearance process with respect to the nomenclature, value and weight of cleared goods and the calculation of customs duties.

"Monitoring-Analysis" implements the integration process for various information sources (GTE DB, TP NSI DB, USRLE DB, EGRN DB) and subsequently used the accumulated (aggregated) data to generate reports and certificates of various forms.

"Monitoring-Analysis" performs the following functions:

– providing access to the central database (CDB) of cargo customs declarations (CCD), as well as to the CDB of customs receipt orders (TPO);

– providing the ability to create and edit conditions that restrict the selection of data from the CCD CDB;

– visual display and printing of report information;

– correction of received reports in Microsoft Excel.

Information on the activities of customs authorities in clearing cargo customs declarations (CCDs) is presented in "Monitoring-Analysis" according to various criteria, including:

– cost, weight and nomenclature of processed goods;

- accrued payments;

– the country of origin and the country of destination of the transported goods;

– participants in customs clearance (customs authorities, customs inspectors, participants in foreign economic activity);

– the dynamics of customs clearance processes.

"Monitoring-Analysis" makes it possible to receive both general data on the customs clearance of goods, and detailed information on each of the participants in foreign economic activity, a specific warehouse and a customs inspector.

Additionally, "Monitoring-Analysis" provides access (analysis and control) to the processes of delivery of goods under customs control.

"Monitoring-Analysis" has a pronounced three-tier structure. The user (via Internet Explorer) sends a request to the WWW server, which forwards it to the ORACLE DBMS. The DBMS processes the request and returns the result to the WWW server.

The WWW server, in turn, converts the received data into an HTML page and returns the result to the user. All software updates of CPS "Monitoring-Analysis" therefore take place on the WWW server and in the ORACLE DBMS, and the changes immediately become available to users.
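As a sketch of this three-tier flow, the fragment below imitates a WWW server that receives a browser request, queries an Oracle database and returns an HTML table. The connection string and the table and column names (ccd, doc_id, goods_value, origin_country) are hypothetical, and Flask with the cx_Oracle driver stands in for whatever server software the actual system uses.

```python
# Tier 1: browser -> Tier 2: WWW server -> Tier 3: ORACLE DBMS, and back.
from flask import Flask, render_template_string
import cx_Oracle  # classic Oracle driver (python-oracledb is its successor)

app = Flask(__name__)

@app.route("/declarations/<country>")
def declarations(country):
    # Tier 2 -> tier 3: forward the request to the DBMS.
    conn = cx_Oracle.connect("user", "password", "oracle-host/SERVICE")
    cur = conn.cursor()
    cur.execute(
        "SELECT doc_id, goods_value FROM ccd WHERE origin_country = :c",
        c=country,
    )
    rows = cur.fetchall()
    conn.close()
    # Tier 3 -> tier 2 -> tier 1: render the data as an HTML page.
    return render_template_string(
        "<table>{% for doc, value in rows %}"
        "<tr><td>{{ doc }}</td><td>{{ value }}</td></tr>"
        "{% endfor %}</table>",
        rows=rows,
    )

# app.run()  # started on the WWW server; clients need only a browser
```

Because all logic lives on the server, updating the application means redeploying this one tier, which is exactly the maintenance property the text emphasizes.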

In addition to the CCD CDB, the system includes the following subsystems:

– TPO CDB - monitoring of customs clearance processes recorded in the CDB of customs receipt orders (TPO);

– DKD CDB - monitoring of the delivery of goods under customs control (access to the "Delivery-CDB" database);

– Search in USRN and USRLE - searching for information about legal entities participating in customs clearance processes.

3. General information about AS ADPPR "Analytics-2000"

The UAIS database of the Federal Customs Service of Russia stores and processes huge amounts of information on various aspects of customs activity, including electronic copies of customs cargo declarations (CCD) and customs receipt orders (registered by Russian customs since 1991). The growth rate of the database volume is on average 600 thousand records per quarter (about 2.5 million per year). This array of data contains the most valuable information about Russia's foreign economic activity.

Significant volumes of information about Russia's foreign economic activity require effective processing tools to provide decision support processes for the management of customs activities.

The first step in creating a full-scale enterprise-level decision support system (DSS) was the development of a system for online multidimensional analysis of electronic copies of customs documents, which provides a depth of data analysis and a level of performance unattainable with conventional statistical processing.

The goals of creating the "Analytics-2000" system:

– reduction of time and labor costs required to obtain aggregated information;

- increasing the productivity of employees of the Federal Customs Service;

– improving the quality of analytical data issued at the request of higher organizations;

- enabling senior and middle managers, as well as analysts, to navigate huge amounts of data and select the information necessary for decision-making;

– provision of graphical representation of data.

The information-analytical unit performs the main function of monitoring: to make sound management decisions, the relevant authorities must analyze and assess the state of the monitored object and the dynamics of its performance indicators. Effective information-analytical support for these tasks can be provided by systems that automate the analytical work of specialists in government bodies and organize the collection, storage and processing of information. The concept of such systems for a wide class of managed objects should be based on the modern technology of integrated data storage and in-depth analytical processing of the accumulated information.

As already noted, the traditional and generally accepted sources of primary information are statistical reporting, accounting and management accounting, financial reporting, questionnaires, interviews, surveys, etc.

The stage of analytical and statistical processing of structured primary information likewise rests on a few traditional, generally accepted approaches. These approaches, and their integration into systems, arose from the objective need to automate accounting and statistical work so as to reflect the processes in the analyzed subject area as accurately, fully and promptly as possible, and to identify their characteristic trends.

The automation of statistical work was embodied in automated statistical information systems: the automated system of state statistics (ASDS) in the 1970s and, from 1988, the design of a unified statistical information system (ESIS). The main objective of these developments was to collect and process the accounting and statistical information needed for planning and managing the national economy, on the basis of the wide application of economic-statistical methods, computer and organizational equipment, and communication systems in the state statistics bodies.

In the structural-territorial aspect, the ASDS was strictly hierarchical, with four levels: union, republican, regional and district (city). At each level, information was processed primarily to serve the tasks of that level.

In the functional aspect, the ASDS comprised functional and support subsystems. Regardless of the content of specific statistical tasks, these subsystems implemented the collection and processing of statistical information, complex statistical analysis, monitoring of indicators, and the preparation and timely submission to the governing bodies of the statistical data needed for current and operational planning. From the user's point of view, monitoring tasks are divided into:

routine tasks related to the processing of statistical reporting data at the relevant structural and territorial levels of ASDS;

tasks of information and reference services;

tasks of in-depth economic analysis.

Routine tasks are associated with processing statistical reporting data at the corresponding levels of the ASDS. Each routine task is, as a rule, tied to the data of one specific form of statistical reporting or of several closely related forms. Such tasks are solved by electronic information processing complexes: sets of software, technical and organizational tools that use local information arrays.

The tasks of information and reference services provide for the formation, on request, of the statistical data needed for the prompt preparation of reports, analytical notes and references; they are not regulated in content. They are solved with the help of an automated data bank: a system for accumulating, storing, searching, processing and issuing information at the request of users in the required form.

The tasks of in-depth economic analysis are based on the use of:

time series analysis (construction of polygons, frequency histograms and cumulative curves; selection of trends from a chosen class of functions);

smoothing of the initial time series, diagnostics based on the selected trend and an autoregressive model, and analysis of residuals for autocorrelation and normality;

pair regression (determination of linear and nonlinear regression equations, evaluation of their statistical characteristics, selection of the optimal form of the relationship);

multiple regression (determination of the matrix of pairwise correlation coefficients, determination of multiple linear regression equations);

factor analysis (obtaining a linear model described by a small number of factors, calculation of the loadings on common factors and of the most general factors, graphical interpretation of factors in the plane and in space);

correlation analysis (obtaining correlation matrices, means and standard deviations).
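As an illustration of two of these tasks, the Python fragment below fits a linear trend to a synthetic time series (trend selection, with residuals kept for later diagnostics) and computes a correlation matrix with means and standard deviations. The data and variable names are invented for the example.

```python
# Trend fitting and correlation analysis on synthetic data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
t = np.arange(48)                                  # 48 observation periods
series = 2.0 * t + 10 + rng.normal(0, 5, t.size)   # linear trend + noise

# Trend selection from a chosen class of functions: degree-1 polynomial.
slope, intercept = np.polyfit(t, series, deg=1)
residuals = series - (slope * t + intercept)       # input for residual checks

# Correlation analysis: correlation matrix, means, standard deviations.
df = pd.DataFrame({
    "output": series,
    "employment": 0.5 * series + rng.normal(0, 8, t.size),
    "prices": rng.normal(100, 3, t.size),
})
print(df.corr())
print(df.mean())
print(df.std())
```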

The organizational and technological form of solving this class of problems is analytical complexes: sets of application software packages oriented toward the implementation of mathematical and statistical methods. To cover wide time ranges of analyzed data, a register form of monitoring is used, based on automated registers that allow significant sets of data to be stored and processed as arrays independent of the structure of statistical reports, for each object or specific group of monitoring objects. The register form of monitoring is especially effective for statistical information characterizing relatively stable objects, so registers can be considered an automated file of groups of homogeneous units of statistical observation of a certain type. It enables the user, by filling out a unified request form, to receive various data characterizing the state of an object.

An important direction in improving statistical monitoring has been to increase the content, reliability and timeliness of reporting data by combining current reporting, one-time records, and sample and monographic surveys, and by optimizing information flows. Particular emphasis is placed on improving economic-mathematical methods for analyzing and forecasting the development of systems. In addition, significant progress in the evolution of monitoring methods has come from new information technologies, namely:

development of a complex information processing technology using data banks and computer networks;

creation of means of computer modeling of data processing systems;

development of intellectualized types of end user interface with a computer based on automated workstations that involve the use of expert systems.

New information technologies have significantly expanded the possibility of direct automated access to the necessary statistical information and have diversified the composition and content of analytical work. It has become possible to integrate a statistical monitoring system with other information systems at all levels of management via telecommunication channels.

However, all the methods of analytical and statistical data processing considered so far share a significant drawback: they treat the entire set of data as a disparate collection, with no systemic unity. Only an artificial connection can be established between information flows, by combining them into a specific reporting form, and it is impossible to foresee such forms for all possible phenomena and connections. Traditional methods of analytical and statistical data processing ignore the fact that there is a natural connection between all kinds of phenomena and events, based on universal indicators inherent in all of them. Given a system of such natural connections, the phenomenon under consideration can be compared with all the factors, events and data related to it explicitly or implicitly. Monitoring built on this approach is characterized by complete coverage of the cause-and-effect relationships of factors, their mutual influence and latent trends, all considered in inseparable systemic unity.

This shortcoming can be eliminated by a now widespread approach to analytical and statistical data processing based on OLAP (Online Analytical Processing) technology.

The term OLAP refers to methods that enable database users to generate descriptive and comparative information about data in real time and to obtain answers to various analytical queries. The defining principles of the OLAP concept include:

multidimensional conceptual representation - OLAP databases must support a multidimensional representation of data that provides for the classic operations of slicing and rotating the conceptual data cube;

transparency - users do not need to know that they are using an OLAP database; they can use the tools they are familiar with to obtain data and make decisions, and they need not know anything about the source of the data;

availability - software tools must select and communicate with the best data source to answer a given request, and must automatically map their own logical schema onto heterogeneous data sources;

consistent performance - performance should be practically independent of the number of dimensions in the query. System models must be powerful enough to handle all the changes to the model in question;

support for client-server architecture - OLAP tools must be able to work in a client-server environment, since it is assumed that the multidimensional database server must be accessible from other programs and tools;

equality of all dimensions - each data dimension must be equivalent both in structure and in operational capabilities. The underlying data structure, formulas, and reporting formats should not focus on any one data dimension;

dynamic processing of sparse matrices - typical multidimensional models can easily contain large sets of cell references, many of which hold no data at any given moment; these missing values must be stored efficiently and must not adversely affect the accuracy or speed of information retrieval;

multi-user support - OLAP tools should support and encourage group work and the sharing of ideas and analyses between users, for which multi-user access to data is essential;

support for operations between different dimensions - all multidimensional operations (for example, aggregation) must be defined and available in such a way that they are performed uniformly and consistently, regardless of the number of dimensions;

intuitive data management - the data provided to the user-analyst should contain all the information necessary for effective navigation (formation of slices, changes in the level of detail of information presentation) and the execution of relevant queries;

flexible reporting - the user must be able to extract any data he needs and present it in any form he requires;

unlimited dimensions and aggregation levels - there should be no limit on the number of supported dimensions or aggregation levels.
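A toy illustration of the first two ideas above: below, a sparse three-dimensional cube is stored as a Python dict keyed by (time, region, product) tuples, so empty cells occupy no space, and the classic slice and rotate operations are one-line transformations. This is a conceptual sketch, not how an industrial OLAP server stores data.

```python
# A sparse conceptual data cube: missing cells simply have no key.
cube = {
    ("2000-Q1", "North", "coal"): 120.0,
    ("2000-Q1", "South", "coal"): 80.0,
    ("2000-Q2", "North", "steel"): 45.0,
}

def slice_cube(cube, axis, value):
    """Classic 'slice': fix one dimension and drop it from the keys."""
    return {
        key[:axis] + key[axis + 1:]: v
        for key, v in cube.items()
        if key[axis] == value
    }

def rotate(cube, order):
    """Classic 'rotate' (pivot): reorder the dimensions of every cell."""
    return {tuple(key[i] for i in order): v for key, v in cube.items()}

print(slice_cube(cube, 0, "2000-Q1"))   # 2D slice for one quarter
print(rotate(cube, (1, 0, 2)))          # region becomes the first dimension
```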

The use of systems based on OLAP technology makes it possible to:

organize a single information repository based on statistical and other reporting data;

provide simple and efficient access to the stored information with differentiated access rights;

provide the possibility of operational analytical processing of stored data, statistical analysis;

streamline, standardize and automate the creation of analytical reports that display data in a specified form.

The main distinctive feature, and an important advantage, of multidimensional data presentation over traditional information technologies is the possibility of jointly analyzing large groups of interrelated parameters, which matters when studying complex phenomena.

OLAP technology significantly reduces the time needed to collect and analyze the primary information required for decision-making in a particular area of human activity, and it also increases the visibility and informativeness of reports on the processes and phenomena in those areas.

OLAP systems make it possible to accumulate large amounts of data collected from various sources; such information is usually consolidated in a data warehouse. Before creating such a system, three main questions should be considered and clarified:

how to collect the data, model it conceptually and manage its storage;

how to analyze the data;

how to efficiently load data from multiple independent sources.

These issues can be correlated with the three main components of a decision support system: the data warehouse server, online analytical data processing tools, and data warehouse replenishment tools.

Since the organization of information warehouses is the subject of other disciplines, we will consider only the question of analytical data processing. A number of OLAP tools can currently be used to analyze information, including MicroStrategy 7i, WebIntelligence, Cognos PowerPlay and AlphaBlox. We will review these products against the following criteria:

ease of use - the software product should be simple enough for a user who does not have special training;

interactivity - the tool must provide interactive features, including viewing documents, dynamically updating existing documents, access to the latest information, dynamic execution of queries against data sources, and unlimited interactive drill-down into the data;

functionality - the application must provide the same capabilities as traditional client / server counterparts;

accessibility - information should be accessible from any device and workplace, and the client part should be small enough to accommodate different levels of user network bandwidth and comply with established standards;

architecture - this criterion characterizes aspects of the software implementation of the product;

independence from data sources - the application must provide access to documents of any type and interactive access to relational and multidimensional databases;

performance and scalability - the application must implement universal database access, server-side data caching, and the like;

security - aspects of application administration to provide different access rights to different categories of users;

cost of implementation and administration - the cost of implementing an OLAP product per user should be significantly lower than for traditional products.

MicroStrategy 7i is a set of software products with a wide range of functions, built on a unified server architecture. The user environment is implemented in MicroStrategy Web Professional.

Users are offered a range of statistical, financial and mathematical functions for complex OLAP and relational analysis. All users have access to both aggregated and detailed information (at the transaction level). You can perform new calculations, filter report data, rotate and add intermediate totals, and quickly change report content.

The main functionality is achieved through the following means:

MicroStrategy 7i OLAP Services - an interface to third-party products;

Intelligent Cube technology - simplifies analysis and deployment by providing summary information for quick online viewing;

MicroStrategy Narrowcaster - lets users distribute metrics or subscribe to them via a web interface. Users can e-mail their reports, schedule report delivery, publish them to workgroups, and export them to Excel, PDF or HTML formats.

This product provides cross-platform support and integration, portability to Unix, support for third-party application servers.

The product is based on an XML architecture. Users can integrate the XML generated by MicroStrategy Web into their applications or format it however they want.

The thin client, implemented in HTML, eliminates browser compatibility issues and can be deployed through any firewall. The appearance and functions of the program can be customized for specific needs, and MicroStrategy Web can be embedded in other applications running on the network.

Computers running MicroStrategy Web can be clustered, which provides scalability and reliability; additional hardware can be added as needed. If a task fails, it is transferred to another computer in the same cluster.

Data is protected at the cell level using security filters and access control lists. Web traffic is secured by transport-level data encryption using SSL (Secure Sockets Layer).

WebIntelligence is a Web product for creating queries, reports and data analysis. It gives network users (on both intranets and extranets) secure access to data for further exploration and management, and it makes analytical capabilities available to various categories of users. A wide range of business intelligence tools is provided, including complex reporting, calculation, filtering, drill-down and aggregation.

WebIntelligence provides the following features:

formatting and printing reports in visual design mode;

multi-block reports. In complex reports, several tables or charts must sometimes be placed together to convey comprehensive information; for this, WebIntelligence makes it possible to add several blocks and diagrams to one report;

the possibility of detailing data in an interactive mode.

The product provides a number of functions:

access to data stored both in traditional relational databases and on an OLAP server;

data analysis functions;

the ability to share information. WebIntelligence is a "thin" client that requires no installation or maintenance of application software or database middleware on the client side. When the client part is installed, a technology can be selected; deployment on Microsoft Windows and Unix platforms is provided.

With WebIntelligence, you can explore and analyze various OLAP data sources and share OLAP and relational data.

The product can be customized to fit the corporate structure of any organization.

WebIntelligence can run on a single server or on multiple NT or Unix machines. Servers can be added to the system as needed; if one component fails, another is used automatically. Weighted load balancing across multiple servers optimizes system resources and guarantees fast response times.

WebIntelligence uses various information security technologies. Where appropriate, components are identified using digital certificate technology. Hypertext transfer protocol is used to work with various network protection systems.

The application has a standard web interface. Basic features are supported (retrieving data with specified dimensions and values, "drilling" into data, nested crosstabs, calculations, enabling / disabling the display of rows, columns and graphs; filters, sorting) for viewing, exploring, reporting and publishing OLAP data in interactive mode.

Cognos PowerPlay provides the following functionality: an HTML/JavaScript application that gives universal access to users running Netscape Navigator version 3.0 or higher or Microsoft Internet Explorer;

access to OLAP data for any user in the organization; creation and publication of BPM (Business Performance Management) reports as PDF documents for the Cognos Upfront portal, giving users access to the most important corporate data on the Web;

conversion of data from PDF format into dynamic reports, their further exploration, and transfer of the results to Upfront;

server support for the Windows NT, Windows 2000 and higher, SUN Solaris, HP/UX and IBM AIX platforms.

Thanks to SSL support, PowerPlay guarantees the security of data sent via the Web. In addition, by defining user classes, system administrators can control access both to local cubes and within the Web portal shell. These classes are stored in a special LDAP (Lightweight Directory Access Protocol) software component responsible for centralized management of the security of the entire system and for integration with existing protection mechanisms.

The use of HTML for the client side ensures that the PowerPlay server operates in a secure environment, enabling the secure deployment of applications for customers, partners and vendors.

AlphaBlox is middleware that provides tools and building blocks for analytical work on the Web, eliminating the complexities associated with securing network connections to databases, authorizing users and formatting data. The AlphaBlox analytical platform is implemented on a standards-based, J2EE-compatible architecture.

AlphaBlox products are designed to perform analytical calculations both inside and outside an organization.

Of particular interest are its Java components (Blox), from which an analytical Web application can be assembled. One of the tedious tasks in creating an OLAP Web product is displaying and formatting data in the browser; very often the data must be shown as a table or as charts. When creating a program with AlphaBlox, any number of these Java components can be inserted and customized for the desired tasks by setting applet parameters, thereby controlling the appearance and functions of the components. This software product provides the following features: access to information - data is retrieved from various relational and multidimensional databases;

queries and analysis - the components perform simple and complex queries against various data sources without requiring SQL programming;

presentation - the ability to present data in various formats (in the form of reports, tables, charts).

The Java components are modular and reusable; they can be used to implement analytical capabilities for a variety of business functions. Because they are controlled by a set of parameters, their properties can be modified with a text editor, which provides flexibility in developing and upgrading an analytical solution. Components can be customized to specific business requirements and reused to deploy additional applications in other areas of the business. Application developers can write additional code in JSP, Java servlets or JavaScript.

AlphaBlox solutions use the services provided by the application server and the Java Runtime Environment (JRE), along with any standard or custom Java extensions developed for this platform.

AlphaBlox's application framework is standards-based and allows integration with existing operating systems, transactional infrastructure, and legacy systems. Provides user access to data from various sources and their subsequent analysis.

AlphaBlox uses standard application server resources and capabilities, including HTTP processing/caching and memory/process management, as well as integration with Web servers. In addition, the J2EE-compatible architecture eliminates unnecessary page refreshes and allows the core logic to run on the server.

AlphaBlox uses the same security model as the application server, implemented with standard J2EE platform features, which eliminates the need to create an independent protection mechanism.

Ease of deployment is one of the main benefits of a Web application, and this fully applies to AlphaBlox applications; they do, however, require certain versions of browsers and of the Java platform, whereas an HTML thin client works in most browsers.

Operational data analysis based on OLAP technology allows analysts, managers and executives to understand data through fast, consistent, interactive access to a wide variety of possible views of information, derived from raw data so as to reflect the real state of the object in a form understandable to users. OLAP functionality is characterized by dynamic multidimensional analysis of consolidated data about the object in support of the end user's analytical work: calculations and modeling applied to the data, trend analysis over successive time intervals, slicing out subsets of data for on-screen viewing, drilling to deeper levels of detail or generalization, and the like.

OLAP tools are focused on multidimensional analysis of information. To achieve this, multidimensional models of data storage and presentation are used: the data is organized into cubes (or hypercubes) defined in a multidimensional space made up of individual dimensions, and each dimension includes many levels of detail. Typical OLAP operations include changing the level of detail of the presentation (moving up and down the hierarchies of dimensions), selecting particular parts of the cube (slicing), and reorienting the multidimensional view of the data on the screen (pivoting, i.e. obtaining a pivot table).
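The typical operations just listed can be sketched with pandas on toy data: drill-down adds a dimension to the grouping, roll-up removes one, and reorienting the view is a pivot. The fact table below is invented for the example.

```python
# Roll-up, drill-down and pivot on a toy fact table.
import pandas as pd

facts = pd.DataFrame({
    "year":    [2000, 2000, 2000, 2001],
    "region":  ["North", "North", "South", "South"],
    "product": ["coal", "steel", "coal", "coal"],
    "amount":  [120.0, 45.0, 80.0, 95.0],
})

rollup = facts.groupby("year")["amount"].sum()              # coarser level
drill = facts.groupby(["year", "region"])["amount"].sum()   # one level down
pivot = facts.pivot_table(index="region", columns="year",
                          values="amount", aggfunc="sum")   # reoriented view
print(pivot)
```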

For OLAP databases, the APB-1 benchmark has been developed; it simulates a realistic workload for OLAP server software. The standard defines a set of dimensions that make up the logical structure of the database: time, scenario, measure, product, customer and channel. The benchmark does not prescribe a specific physical model: input data is supplied in ASCII files. The test operations carefully model standard OLAP operations on large amounts of data loaded sequentially from internal or external sources, including aggregation of information, hierarchical drill-down through the data, calculation of new data based on business models, and the like.

The capabilities of OLAP technology can thus serve as the basis for organizing and multidimensionally analyzing monitoring information. Let us look at the stages of this process.

Before information can be loaded into a multidimensional monitoring database (MDB), it must be extracted from various sources, cleaned, transformed and consolidated (Fig. 1.3). Afterwards, this information must be updated periodically.

Fig. 1.3. Stages of preparing data for loading into the multidimensional database: extraction, cleaning, transformation and consolidation

Data extraction is the process of extracting data from operational databases and other sources. An analysis of the available sources of information shows that most of them are presented in the form of tabular data received either in electronic or printed form. Modern means of scanning and image recognition make it possible to automate this stage of data preparation almost completely.

Before entering information into the database, it is necessary to clean it up. Typically, cleanup involves filling in missing values, correcting typos and other data entry errors, defining standard abbreviations and formats, replacing synonyms with standard identifiers, and the like. Data that is determined to be false and cannot be corrected is discarded.

After cleaning the data, it is necessary to convert all the received information into a format that will meet the requirements of the software product (OLAP server) used. The conversion procedure becomes especially important when it is necessary to combine data from several different sources. This process is called consolidation.
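The cleaning and consolidation steps described above can be sketched in a few lines of pandas. The sample table, station names and the chosen repair rules (a synonym dictionary, discarding negative readings, median imputation) are illustrative assumptions.

```python
# Cleaning a hypothetical sample table before loading it into the MDB.
import pandas as pd

raw = pd.DataFrame({
    "station": ["Pinega", "pinega ", "Pushkinskie Gory", None],
    "so2_mg_m3": [0.04, None, 0.07, -1.0],   # -1.0 is an impossible reading
})

SYNONYMS = {"pinega": "Pinega"}               # map synonyms to standard IDs

clean = raw.copy()
clean["station"] = clean["station"].str.strip().replace(SYNONYMS)
clean = clean.dropna(subset=["station"])      # the object cannot be recovered
clean = clean[clean["so2_mg_m3"].isna() | (clean["so2_mg_m3"] >= 0)]
clean["so2_mg_m3"] = clean["so2_mg_m3"].fillna(clean["so2_mg_m3"].median())
print(clean)
```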

The stage of loading information into the MDB consists in creating the necessary data structure and filling it with the information obtained at the previous stages of data preparation.

Extraction of information from the MDB can be implemented with Microsoft SQL Server Analysis Services, which acts as both a multidimensional data provider and a tabular data provider. Executing a query therefore returns either a multidimensional dataset or a regular table, depending on the query language used: Analysis Services supports both SQL and the MDX (multidimensional expressions) extensions.

SQL queries can be passed to Analysis Services using the following data access interfaces:

Microsoft OLE DB and OLE DB for OLAP;

Microsoft ActiveX Data Objects (ADO) and ActiveX Data Objects Multidimensional (ADO MD).

OLE DB for OLAP extends the capabilities of OLE DB to include objects specific to multidimensional data. ADO MD extends ADO in a similar way.
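For illustration, here is what a simple MDX query looks like; in practice the string would be sent through one of the interfaces above (OLE DB for OLAP or ADO MD). The cube and member names (Sales, Time, Region, Amount) are hypothetical.

```python
# An MDX query held in a Python string; execution through ADO MD / OLE DB
# for OLAP is environment-specific and therefore omitted here.
mdx = """
SELECT
  { [Measures].[Amount] }           ON COLUMNS,
  { [Time].[2000], [Time].[2001] }  ON ROWS
FROM [Sales]
WHERE ( [Region].[North] )
"""
print(mdx)
```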

Microsoft SQL Server Analysis Services also supports the MDX extensions, which provide a rich and powerful query syntax for working with the multidimensional data stored by the OLAP server in cubes. Analysis Services supports MDX functions for defining calculated fields, building local data cubes, and running queries using PivotTable Services.

It is possible to create custom functions that work with multidimensional data. Interaction with them (passing arguments and returning a result) occurs using the MDX syntax.

Analysis Services provides over 100 built-in MDX functions for defining complex calculated fields. These functions fall into the following categories: working with arrays; working with dimensions; working with hierarchies; working with hierarchy levels; logical functions; working with objects; numerical functions; working with sets; working with strings; working with tuples.

It is possible to create local cubes intended for viewing on computers where no OLAP server is installed. Creating local cubes requires MDX syntax and goes through the PivotTable Services component, which is the OLE DB client of the OLAP server. This component can also work with local cubes offline, without a connection to the OLAP server, by providing an OLE DB data source interface. Local cubes are created using the CREATE CUBE and INSERT INTO statements.

The MDX query language, an extension of SQL, allows you to query data cubes and return results as multidimensional datasets.

Just as in regular SQL, the creator of an MDX query must first define the structure of the dataset to be returned, and in most cases thinks of the returned dataset as a multidimensional structure. Unlike a regular SQL query, which operates on tables to produce a two-dimensional record set, an MDX query operates on cubes to produce a multidimensional result set. Note that an MDX query can also return a two-dimensional dataset, which is simply a special case of a multidimensional one.

Visualizing multidimensional datasets can be quite difficult. One visualization technique is to constrain the output to a flat, two-dimensional table by nesting several dimensions along a single axis; this nesting produces subheadings.
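The pandas fragment below shows this flattening: two dimensions are nested along the row axis of a flat table, and the inner dimension appears as the subheadings the text mentions. The data is invented for the example.

```python
# Flattening a multidimensional result by nesting dimensions on one axis.
import pandas as pd

idx = pd.MultiIndex.from_product(
    [["2000", "2001"], ["North", "South"]], names=["year", "region"])
flat = pd.DataFrame({"amount": [120.0, 80.0, 95.0, 60.0]}, index=idx)
print(flat)   # 'region' values print as nested subheadings under each year
```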

PivotTable Services, part of Microsoft SQL Server Analysis Services, is a component for accessing OLAP data that functions as an Analysis Services client.

PivotTable Services features include data analysis, cube building and optimal memory management. The component provides an interface to multidimensional data; data can be saved in a local cube on the client computer and then analyzed without connecting to the OLAP server. PivotTable Services is needed to perform the following tasks:

establishing a connection with the OLAP server as a client component;

providing programs with an OLE DB interface with OLAP extensions;

functioning as a tabular data source supporting a subset of SQL;

functioning as a multidimensional data source supporting the MDX extensions;

creating a local data cube;

functioning as a mobile desktop OLAP client.

The PivotTable Services component can work with only one local cube partition, and it has no built-in system for managing levels of information provision. Its performance is therefore directly proportional to the amount of data it addresses.

It should be noted that the OLAP interface is simple and requires no more knowledge than a spreadsheet. OLAP allows the use of various report forms, an interface for interactive data analysis, and the generation of printed forms. Compared with traditional programming of custom reports, however, OLAP not only reduces programming costs hundreds of times but also changes the very principle of the user's work with a report.

What distinguishes OLAP as a reporting tool is the ability to perform the following operations on data automatically and interactively:

recursive grouping of data; calculation of subtotals for subgroups; calculation of the final results.

The commands to perform these operations are given by the user himself; sections of the displayed table act as controls. When the user changes the form of the report (for example, moves columns), the system recalculates the subtotals and displays the new report.
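A sketch of those three operations on toy data, using pandas: pivot_table groups the data, and margins=True adds the subtotal row/column and the grand total that an OLAP report recalculates whenever the layout changes.

```python
# Grouping, subtotals and a grand total in one call.
import pandas as pd

facts = pd.DataFrame({
    "region":  ["North", "North", "South", "South"],
    "product": ["coal", "steel", "coal", "steel"],
    "amount":  [120.0, 45.0, 80.0, 30.0],
})

report = facts.pivot_table(index="region", columns="product",
                           values="amount", aggfunc="sum",
                           margins=True, margins_name="Total")
print(report)   # the "Total" row and column hold subtotals and grand total
```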

Additionally, the user can change the sorting and filter by arbitrary combinations of data, see the data in percentage terms, change the scale and perform other necessary report transformations (these features are not an indispensable attribute of OLAP technology, but depend on the specific implementation of the tool).

As a result, the user can independently and intuitively generate, from the existing data set, every type of report possible for that set. This helps overcome the age-old limitation of information systems: that the power of the interface is always lower than the power of the database.

OLAP technology makes it possible to implement almost all possible kinds of tabular representation of database contents. If the product is flexible enough, the programmer's task reduces to describing the semantic layer (dictionary), after which a qualified user can independently create new cubes using the terms of a subject area familiar to him, and other users can generate reports for each cube.

Thus, OLAP technology serves both developers and users in all those cases when it is necessary to see information in the form of tabular reports in which data is grouped and totals are calculated for groups.

Experience shows that it is not enough to provide users with a large cube consisting of many dimensions and facts. This is due to the following reasons.

First, at every moment the user needs a well-defined report.

Secondly, some algorithms for calculating the totals are described by complex formulas, and the user may not have sufficient qualifications to determine them.

Thirdly, an OLAP report can have a specific method for calculating totals, the location of dimensions, and initial sorting conditions specified by the report author.

Fourth, in many cases it is easier to understand the data if you look not at a table with numbers, but at a chart. To set up an OLAP diagram, sometimes you need to have a good spatial imagination, since a cube with many dimensions needs to be reflected as a set of shapes or lines in a three-dimensional drawing. The number of properties of modern graphical components is in the thousands, so preconfiguring a chart or graph for an OLAP report can take a long time.

Fifth, as for any other report, for an OLAP report, its effective design is important, including settings for headings and captions, colors and fonts.

Thus, for a comfortable user experience, an OLAP report must contain a certain set of applied metadata that describes aggregation algorithms, preconditions for filtering and sorting, headings and comments, and visual design rules.

When visualizing the information of a multidimensional cube, an important factor is the ordering of the dimensions according to their similarity: dimensions that characterize similar parameters should be placed side by side. To identify such dimensions, various clustering methods can be used, in particular heuristic algorithms.
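One possible heuristic, sketched below: cluster the dimensions by the similarity of their value profiles and take the leaf order of the resulting dendrogram, so that similar dimensions end up adjacent. The profiles and dimension names are illustrative.

```python
# Ordering dimensions by similarity via hierarchical clustering.
import numpy as np
from scipy.cluster.hierarchy import linkage, leaves_list

names = ["CO", "NO2", "SO2", "dust"]
profiles = np.array([          # each row: one dimension's value profile
    [1.0, 0.9, 0.2, 0.1],
    [0.9, 1.0, 0.3, 0.2],
    [0.2, 0.3, 1.0, 0.8],
    [0.1, 0.2, 0.8, 1.0],
])

order = leaves_list(linkage(profiles, method="average"))
print([names[i] for i in order])   # similar dimensions placed side by side
```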

The information-analytical technology described here is not the only possible one, but all such technologies are developments of Business Intelligence (BI), whose purpose is the collection, systematization, analysis and presentation of information. The choice of a specific information-analytical technology rests with the user, taking into account the features of the object and subject area.
