What Is an Integrated Database?

Data integration is the logical or physical integration of data from different sources, formats, and characteristics in order to provide enterprises with comprehensive data sharing. In the field of enterprise data integration, many mature frameworks are already available. Federated systems, middleware-based models, and data warehouses are commonly used to construct integrated systems. These technologies address data sharing and provide decision support for enterprises, each with a different focus and application.

In enterprises, because systems were developed at different times or by different departments, multiple heterogeneous information systems often run simultaneously on different software and hardware platforms. The data sources of these systems are independent of and closed to each other, which hinders communication, sharing, and integration between the systems and creates "information islands". As information technology is applied ever more deeply, the demand for information exchange within enterprises and between them keeps growing, and interconnecting and integrating the existing information has become urgent.
In recent decades, the rapid development of science and technology and the advancement of information technology have caused the amount of data accumulated by human society to exceed the total of the previous 5,000 years, and the pace of data collection, storage, processing, and dissemination has also accelerated. By sharing data, enterprises can let more people make fuller use of existing data resources and reduce duplicated effort in data collection and management.
Data integration model classification
As noted above, federated systems, middleware-based models, and data warehouses are the common approaches used to construct integrated systems, each addressing data sharing and decision support with a different focus. Here is a basic analysis of these data integration models.
Data cache is key
For data integration architectures, the key is a data cache that covers target planning, source-to-target mapping, data acquisition, hierarchical extraction, error recovery, and secure transformation. In addition, the data cache contains pre-customized ...
When implementing data integration, the most important thing is to make sure there are corresponding business requirements. Three common business scenarios are listed below:
Enterprise groups need unified data
When businesses merge, you may need to consolidate and integrate all corporate data. Take Hypercity as an example: Hypercity, Shoppers Stop, and Crosswords are all companies of the same retail group, and all of their customer data needs to be integrated in order to serve the shared customer base better.
Facilitate data flow in the system
That is, when you need to integrate multiple data sources and applications to implement a business process. For example, the data feeding some business analysis tools comes from multiple applications, such as a merchandise management system or an Oracle financial system.
Data integration that may be required when deploying a new application
A new enterprise application may require all the data from existing applications. Take Hypercity again: when implementing a home delivery application, customer and product information must be retrieved from existing systems. At this point, data integration becomes very important.
Key steps:
1. Cooperation with software vendors
As a business, it pays to get the vendors to understand your business needs, because only then can they accurately identify and integrate all the data points you need.
2. Define prioritization of integration
List all required data integration tasks and arrange a deployment plan. Your goal should be to complete all data integration activities before deployment or formal commissioning, and to define how often growing data is refreshed. In addition, estimate the benefits of the data integration solution in terms of cost and time savings.
3. Choose the right integration interface
Data integration solutions provide two kinds of data interface: one-way and two-way. You need to know which one applies.
In a one-way interface, data is transferred only from point A to point B; nothing flows back. On our B2B platform, for example, suppliers can track shipping information to stores: inventory, payment, and sales information is sent to the B2B platform, but no data is returned to these data sources.
In a two-way interface, data is transferred from one application to another and then returned. On our platform, if a new application such as a point of sale (POS) system is deployed, product data is sent from the merchandise management system to the POS, and sales data is then sent back from the POS, as sketched below.
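To make the contrast concrete, here is a minimal Python sketch of the two interface styles. The record layouts and the send_to_b2b / sync_pos functions are hypothetical illustrations, not part of any particular platform.

```python
# Hypothetical sketch contrasting one-way and two-way interfaces.

def send_to_b2b(platform: dict, records: list[dict]) -> None:
    """One-way interface: inventory, payment, and sales records are pushed
    to the B2B platform; nothing flows back to the source systems."""
    platform.setdefault("inbox", []).extend(records)

def sync_pos(merchandising: dict, pos: dict) -> None:
    """Two-way interface: product data flows from the merchandising system
    to the POS, and sales data flows back the other way."""
    pos["products"] = list(merchandising["products"])                    # outbound leg
    merchandising.setdefault("sales", []).extend(pos.get("sales", []))   # return leg

if __name__ == "__main__":
    b2b = {}
    send_to_b2b(b2b, [{"store": 17, "sku": "A1", "sold": 3}])

    merch = {"products": [{"sku": "A1", "price": 9.9}]}
    pos = {"sales": [{"sku": "A1", "qty": 3}]}
    sync_pos(merch, pos)
    print(b2b, merch["sales"], pos["products"])
```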
4. Choose the right interface medium; the simplest is not necessarily the best
The choice of interface medium must take future needs and upgrades into account. There are many ways to integrate data ...
The role of data integration in enterprise information systems
The emergence of data integration enables enterprises to expose back-end ERP information on the Internet. Data integration products provide a "cache", or data staging area, between a company's Internet-facing computers and back-end systems from vendors such as SAP, Oracle, and PeopleSoft.
Data integration provides a mirror image of the back-end information stored on a business's host computer. When an Internet customer needs to check the status of an order, the query is routed to the data integration software, so it is not always necessary to access the enterprise's main computer. The data integration software is smart enough to know when to synchronize with the host so that the data stays up to date. Integrating ERP data for e-commerce applications is accomplished through a combination of data staging and direct access to ERP data, using a data server and a number of data caches. The data integration software intelligently mixes direct real-time and batch access methods to extract data from the ERP system.
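As a rough illustration of this staging idea, the following Python sketch shows a cache that answers order-status queries locally and synchronizes with the back-end host only when its copy is stale. The fetch_order_from_host function, the refresh interval, and the record layout are all assumptions made for the example.

```python
# Hypothetical sketch of a data staging cache in front of a back-end ERP host.
import time

REFRESH_SECONDS = 300  # assumed freshness window

_cache: dict[str, tuple[float, dict]] = {}

def fetch_order_from_host(order_id: str) -> dict:
    """Placeholder for a direct (real-time or batch) ERP extraction."""
    return {"order_id": order_id, "status": "shipped", "fetched": time.time()}

def order_status(order_id: str) -> dict:
    """Serve from the staging cache; synchronize with the host only when stale."""
    cached = _cache.get(order_id)
    if cached and time.time() - cached[0] < REFRESH_SECONDS:
        return cached[1]                      # answered without touching the host
    record = fetch_order_from_host(order_id)  # synchronize with the host
    _cache[order_id] = (time.time(), record)
    return record

if __name__ == "__main__":
    print(order_status("SO-1001"))  # first call hits the host
    print(order_status("SO-1001"))  # second call is served from the cache
```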
Data moves from one or more sources to one or more target tables and information types (such as XML). The steps of a data movement include determining which sources data should be extracted from, which transformations should be applied to the data, and where the data should be sent. The user specifies data mappings and transformations through a graphical user interface.
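A source-to-target mapping of this kind can be sketched as a small table of (source field, target field, transformation) entries. The field names and transformations below are invented purely for illustration; a real platform would capture the same information through its graphical mapping tools.

```python
# Hypothetical sketch of a source-to-target mapping: which field to extract,
# how to transform it, and which target column (or XML element) it lands in.

MAPPING = [
    # (source_field, target_field, transformation)
    ("cust_no",   "customer_id", str.strip),
    ("cust_name", "full_name",   str.title),
    ("amt",       "amount_usd",  lambda v: round(float(v), 2)),
]

def move_row(source_row: dict) -> dict:
    """Apply the mapping to one source row and produce one target row."""
    return {target: transform(source_row[src]) for src, target, transform in MAPPING}

if __name__ == "__main__":
    raw = {"cust_no": " 00042 ", "cust_name": "jane doe", "amt": "199.994"}
    print(move_row(raw))
    # {'customer_id': '00042', 'full_name': 'Jane Doe', 'amount_usd': 199.99}
```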
A user-defined procedure controls the movement of each piece of data and captures the dependencies between such movements. For example, if a target table depends on the values of other target tables, procedures specify in what order the data server should run the individual data movements into those tables. Data movements can be designed to run in batch or real-time mode and are created and managed by administrators to control data movement between ERP, e-commerce, customer relationship management, supply chain management, and communications applications. Data movement uses distributed query optimization, multithreading, in-memory data conversion, and parallel pipeline operations to provide high data throughput and scalability. For example, to manage extraction programs and perform batch data extraction from SAP software, optimized ABAP code (SAP's proprietary programming language) can be generated, eliminating the need to develop and maintain custom ABAP code.
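The dependency ordering described here amounts to a topological sort of the target-table loads. The following sketch, using Python's standard graphlib module and invented table names, shows one way such ordering could be expressed; it illustrates the idea rather than any product's actual mechanism.

```python
# Hypothetical sketch of dependency-ordered data movement: targets that depend
# on values in other targets are loaded only after those targets are loaded.
from graphlib import TopologicalSorter

# target table -> target tables it depends on
DEPENDENCIES = {
    "dim_customer": set(),
    "dim_product":  set(),
    "fact_sales":   {"dim_customer", "dim_product"},  # needs both dimensions first
}

def run_movements(dependencies: dict[str, set[str]]) -> list[str]:
    """Return (and pretend to execute) the movements in a safe order."""
    order = list(TopologicalSorter(dependencies).static_order())
    for table in order:
        print(f"loading {table} ...")  # a real server would run the batch/real-time load here
    return order

if __name__ == "__main__":
    run_movements(DEPENDENCIES)  # dimension tables load before fact_sales
```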
Introduction
Informatica Enterprise Data Integration includes two products, Informatica PowerCenter and Informatica PowerExchange. With their high-performance, fully scalable platform, they can support almost any data integration project and enterprise integration solution.
· Informatica PowerCenter is used to access and integrate data from almost any business system in any format and can deliver data throughout the enterprise at any speed. It is characterized by high performance, high scalability, and high availability. Informatica PowerCenter comes in four editions: Standard Edition, Real-Time Edition, Advanced Edition, and Cloud Computing Edition. It also provides multiple optional components that extend its core data integration functions, including data cleansing and matching, data masking, data validation, Teradata dual load, enterprise grid, metadata exchange, Pushdown Optimization, team development, and unstructured data.
· Informatica PowerExchange is a family of data access products that ensures IT organizations can access and deliver critical data throughout the enterprise when and where they need it. With this capability, IT organizations can optimize the business value of limited resources and data. Informatica PowerExchange supports many different data sources and applications, including enterprise applications, databases and data warehouses, mainframes, midrange systems, messaging systems, and technical standards.
Points to note
Data integration is a problem enterprises face as they develop further. Through data modeling and related application technologies, enterprise information integration can be analyzed and put into practice. While applying model-driven design ideas to develop applications, the following points should be kept in mind:
(1) Timeliness of the model: it covers both the development-time model and the runtime model, and the runtime model embodies the core idea of being model-driven.
(2) Evolution of the model: whether the model can change itself as the application changes.
(3) Hierarchy of the model: As the complexity of the system increases, the model can be composed of multiple levels.
Integration challenges
Data integration challenges facing IT organizations in the economic crisis
To survive the current economic crisis and become stronger, businesses must transform into data-driven businesses. They need to treat their corporate data as a valuable asset that can be used to support strategic and operational decisions. By becoming data-driven, businesses can operate more efficiently, better manage risk, improve customer service, make informed decisions faster, and keep costs low.
IT organizations play a key role in this evolution. Businesses expect their IT teams to provide complete, accurate, consistent, and up-to-date data when and where they need it. Data not only drives the key goal of keeping the business running during the economic downturn, but also prepares enterprises to develop and succeed when conditions improve.
During the economic downturn, IT also faces severe challenges. How can your IT organization squeeze more value from available human and technical resources? Faced with stricter budget reviews for each project, how can you accelerate deployment? How can your team stay flexible in responding to changing business needs?
In short, how do you do more with less: respond to more projects with less money, fewer resources, and less time? To meet these challenges, your IT organization needs to do three things:
1. Reduce costs
2. Operate more efficiently
3. Maximize the value of existing technologies
To achieve these three goals, IT organizations need to rely on a comprehensive, unified, open, and economical data integration platform.
Integration issues
Four issues facing IT organizations
Companies' current choices will determine whether they can weather the current financial turmoil. Executives of all types of companies are exploring:
Going global · How can my business diversify across different regions to reduce dependence on any one economy?
Gaining market share · How can my business expand through ambitious acquisitions?
Keeping it simple · How can my business reduce overhead and win through agility and flexibility?
Staying compliant · How can my business ensure compliance with current and upcoming industry and government regulations?
Answering these questions requires thousands of decisions, but there is only one key to making every one of them well: timely, complete, and accurate data. This is where IT comes in. Businesses expect their IT organizations to provide the data the business needs, when it needs it.
But that is easier said than done. Data is scattered across the enterprise: applications, databases, PDFs on desktops, Excel spreadsheets, and Word documents. It is also stored outside the corporate firewall, in the application "cloud" with software-as-a-service (SaaS) and business process outsourcing (BPO) vendors and with trading partners.
IT organizations find this a tricky problem. Each business initiative generates a new IT scenario. Each new IT scenario creates a new IT project. And every IT project needs data: accessing data, migrating and consolidating data, and, fundamentally, knowing the quality of the data.
Shortcomings of traditional methods
Traditional data integration methods fall short. They do not address the complexity of today's IT environments, nor do they cover the full set of scenarios that IT must implement.
Single-point solutions that connect hundreds (or thousands) of applications simply fragment operational data and lock it inside departmental applications such as ERP and CRM. An application-centric approach to data integration does not consider all enterprise data. For example, it cannot handle planning data, which is usually kept in Excel spreadsheets rather than in departmental database applications. It also does not address data held by BPO or SaaS vendors outside the enterprise, or data shared with trading partners.
Hand-coded data integration does not work either. Manual coding is time-consuming, labor-intensive, and error-prone. As IT organizations strive to manage more data and more data formats, manual coding usually produces more complexity rather than less, as shown in Figure 2. It increases maintenance costs and reduces IT efficiency.
And how do these methods perform on data quality? Traditional data integration methods cannot guarantee that all data (customer data, material and asset data, and financial data) remains complete, consistent, accurate, and up to date, regardless of where the data resides.
If your IT organization continues to use traditional methods for data integration, that is, "islands" by department, application, or database, you will spend more time and money managing complexity just to keep the business running, instead of focusing on new business initiatives.
Characteristics of the new approach
IT organizations need a reliable new approach to data integration, one that can:
· Integrate all internal, on-premises data islands in the enterprise, including unstructured data
· Integrate external data in cloud computing applications and systems
· Exchange data seamlessly with trading partners in business-to-business formats
· Ensure the quality of all data
· Cost-effectively manage the application life cycle
At the same time that companies are asking their IT organizations to handle more data integration projects, they are tightening their finances. Even where IT budgets are not being cut aggressively, companies are examining each expense more closely. Companies are slowing their IT procurement cycles to stay cautious in other areas. They are extending deployment timelines to assess total cost of ownership (TCO) and analyze potential return on investment (ROI). In addition, they are actively looking for ways to control costs and eliminate redundancy.
Facing the balance of these two opposing forces, your IT organization needs to increase ROI while reducing TCO. You can do so in three ways:
1. Improve operational efficiency
2. Make full use of existing technology investments
3. Reduce development and deployment costs and operating and maintenance expenses
IT organizations can pursue all three at once through a data integration platform. As shown in Figure 3, a data integration platform is a comprehensive set of technologies for accessing, discovering, cleansing, integrating, and provisioning data for the expanding enterprise.
The data integration platform supports various data integration projects, such as:
· Data warehouse
· Data migration
· Test data management
· Data archiving
· Data integration
· Master data management
· Data synchronization
· B2B Data Exchange
We'll see how a data integration platform can help your IT organization:
· Reduce costs
· Operate more efficiently
· Make full use of existing technology investments
Reduce costs
New data integration methods help businesses reduce costs
Today's closely reviewed IT budgets make cost a key consideration. Separate integration methods, such as manual coding or single-point solutions, may seem economical at first glance, but supporting them quickly proves time-consuming and laborious. Changing a single application or system triggers a chain reaction across multiple integration points, producing unreliable results that require additional cross-checks and manual cleanup.
In contrast, a data integration platform can significantly reduce the time and resources required for deployment, maintenance, and management. Easy-to-use, role-based tools and reusable libraries of development assets increase productivity and reduce deployment time. Standardized methods eliminate discrepancies and make results more accurate. High scalability and simple management simplify maintenance and upgrades. This translates into lower IT costs both initially and over time.
From just "keep your business running" to "keep developing new projects"
A data integration platform can help your IT organization significantly reduce costs and move from simply "keeping the business running" to "continuously developing new projects."
Consider an example.
Suppose your IT organization adopts a data integration platform and, thanks to its ease of use and management, pre-built connectivity, reusable logic and rules, high scalability and performance, and seamless upgrades, realizes the platform's cost savings. You finally have the resources and budget to launch the one critical application the business has been asking you for over the past six months.
You face three basic issues:
1. How do you migrate the required data from the old system to the new system and ensure that only useful, accurate, and valid data is migrated, in accordance with business requirements?
2. How do you test that the system is configured correctly and working properly before a migration failure can occur?
3. How do you make sure your application doesn't swell over time, forcing you to buy more primary storage, more database licenses, and more powerful processors to keep the system running efficiently?
Your data integration platform is your remedy.
First, you need to define precisely which data matters for the migration from the legacy application to the new one. With the data integration platform, you can profile the old and new data structures and quickly build mappings to the new system. You can reuse these mappings later whenever you need to move data into or out of the system.
Second, you need to test and configure the application. With a data integration platform, you can select only the most relevant business data to quickly copy and refresh the specific production data that meets your needs. Compared with creating a full system or database backup, this approach greatly reduces the time, effort, and disk space required.
Finally, after the application is fully up and running, you need to migrate inactive data from the new application to a secure archive so that the mission-critical application stays stable in terms of storage, database licenses, and performance. With a data integration platform, you can easily identify and move inactive data into long-term retention, online or offline, and still access the archived data at any time.
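As a rough sketch of this archiving step, the code below moves rows whose last activity falls outside an assumed one-year retention window from the live table into an archive that remains accessible. The threshold, record layout, and dates are hypothetical.

```python
# Hypothetical sketch of archiving inactive data out of the live application.
from datetime import date, timedelta

RETENTION = timedelta(days=365)  # assumed threshold for "inactive"

def archive_inactive(live: list[dict], archive: list[dict], today: date) -> None:
    """Move inactive rows from the live table to the archive, in place."""
    still_live = []
    for row in live:
        if today - row["last_activity"] > RETENTION:
            archive.append(row)      # retained online/offline for later access
        else:
            still_live.append(row)
    live[:] = still_live

if __name__ == "__main__":
    live_orders = [
        {"order_id": 1, "last_activity": date(2008, 1, 5)},
        {"order_id": 2, "last_activity": date(2009, 6, 1)},
    ]
    archived: list[dict] = []
    archive_inactive(live_orders, archived, today=date(2009, 7, 1))
    print(len(live_orders), "live;", len(archived), "archived")
```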
Efficient operation
New data integration methods help businesses operate more efficiently
As companies increasingly view data management as a business issue, not just an IT consideration, minimizing the complexity of multiple tools, skill sets, and vendors is especially critical to improving productivity. Many IT organizations have yet to learn this important lesson. They try to handle multiple data integration projects, yet the approach taken for each project is still ad hoc. Because each project uses different tools and methodologies and does not take full advantage of the lessons learned from past projects, the result is often high cost, complexity, redundancy, and unreliability.
Data integration platforms help IT organizations operate more efficiently by increasing productivity. The platform frees IT from having to do duplicate work on every project. Instead, IT can share methods, technologies, and assets, such as logic and metadata, across all projects.
When you standardize data integration practices on the platform and then create an Integration Competency Center (ICC) or Center of Excellence, you can achieve significant savings in the development time and cost of integrated applications and data interfaces, as well as in maintenance costs.
Data integration also involves many different roles, from data administrators and business analysts to data architects and IT developers, each playing a distinct part. IT and the business need to work together to respond to changing business needs faster and more affordably.
A unified data integration platform enables IT and business units to collaborate more effectively. The platform provides a tool set with a consistent interface and user experience, so that each part of the tool set can be used seamlessly across multiple projects. The tools are tailored to each function, so people in each role can focus on their area of expertise and build their skills faster. Individuals involved in data integration spend less time learning the platform, so they can spend more time on their actual work.
Maximize technology value
New data integration methods help companies increase the value of their technology
In the current economic environment, every technology investment is subject to strict scrutiny, and IT organizations need to make the most of existing technology. With a data integration platform, IT organizations can continue to use legacy systems and applications, avoiding the waste and risk of a "rip and replace" approach.
In addition, the data integration platform lets IT teams reuse assets between projects, reducing TCO and the cost of training staff and developing skill sets. Adopting the same processes and methodologies across multiple projects lets companies start with a small project, such as a single data warehouse, and then easily expand the scope as needed. At first, IT adopts only the specific data integration tools needed for the current project. Then, as new projects emerge, IT can take advantage of the platform's common engine, user interface, and metadata, as well as its already trained users, to take on these new projects quickly and cost-effectively.
The ideal data integration platform
A data integration platform must address the fragmentation of data across the enterprise so that data-driven business decisions can be made faster and business operations conducted more effectively and efficiently. It must serve as a foundation of the enterprise's technology and provide a manageable way to integrate data.
To meet these needs, a data integration platform must have four characteristics: comprehensive, unified, open, and economical.
Comprehensive: supports the complete data integration lifecycle
The data integration platform must support all five key steps in the data integration lifecycle: access, discovery, cleaning, integration, and delivery (see Figure 4).
Step 1: Access. Most organizations' data is stored in thousands of locations, not just inside the enterprise but also in the "cloud" of business partners or SaaS vendors outside the firewall. Regardless of source or structure, all of this data must be accessible. Data must be extracted from legacy mainframe systems, relational databases, applications, XML, messages, and even documents such as spreadsheets.
Step 2: Discover. Data sources, especially those whose records are incomplete or unknown, must be profiled to understand their content and structure. Patterns and rules in the data need to be inferred, and potential data quality issues must be flagged.
Step 3: Cleanse. Data must be cleansed to ensure its quality, accuracy, and completeness. Errors and omissions must be addressed, data standards enforced, values validated, and duplicate entries removed.
Step 4: Integrate. To maintain a consistent view of the data across multiple systems, the data must be integrated and transformed so as to reconcile differences in how different systems define and structure various data elements. For example, marketing and financial systems may have completely different business definitions and data formats for "customer profitability", and these differences must be resolved.
Step 5: Deliver. The right data must be delivered in the right format at the right time to every application and user that needs it. The data delivered ranges from a single element or record supporting real-time business operations to millions of records for trend analysis and enterprise reporting. High availability and secure delivery of the data must be ensured.
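To show how the five steps fit together, here is a minimal, self-contained sketch that runs access, discover, cleanse, integrate, and deliver over a tiny in-memory record set. Every function is a simplified stand-in; a real platform would perform these steps against databases, applications, files, and message queues.

```python
# Minimal, hypothetical sketch of the five lifecycle steps.

def access() -> list[dict]:
    """Step 1: pull raw records from the (simulated) sources."""
    return [
        {"id": "1", "name": " alice ", "country": "US"},
        {"id": "1", "name": "Alice",   "country": "US"},   # duplicate
        {"id": "2", "name": "BOB",     "country": None},   # missing value
    ]

def discover(rows: list[dict]) -> dict:
    """Step 2: profile the data and flag potential quality issues."""
    return {"rows": len(rows),
            "missing_country": sum(1 for r in rows if not r["country"])}

def cleanse(rows: list[dict]) -> list[dict]:
    """Step 3: standardize values, fill defaults, and drop duplicates."""
    seen, clean = set(), []
    for r in rows:
        if r["id"] in seen:
            continue
        seen.add(r["id"])
        clean.append({"id": r["id"],
                      "name": r["name"].strip().title(),
                      "country": r["country"] or "UNKNOWN"})
    return clean

def integrate(rows: list[dict]) -> dict:
    """Step 4: reconcile records into one consistent view keyed by id."""
    return {r["id"]: r for r in rows}

def deliver(view: dict) -> None:
    """Step 5: hand the trusted data to consuming applications (printed here)."""
    for record in view.values():
        print(record)

if __name__ == "__main__":
    raw = access()
    print("profile:", discover(raw))
    deliver(integrate(cleanse(raw)))
```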
In addition, the data integration platform must:
Audit, manage, and monitor. Data stewards and IT administrators need to collaborate to audit, manage, and monitor data. Key indicators such as data quality should be measured continuously so that they improve steadily over time. The goal is to track the progress of key data attributes and flag any new issues so they can be resolved and fed back into the data integration lifecycle for continuous improvement.
Define, design, and develop. Business analysts, data architects, and IT developers need a powerful set of tools that help them collaborate on defining, designing, and developing data integration rules and processes. The data integration platform should include a common set of integration tools to ensure that everyone works together effectively.
Implement any data integration project
The data integration platform must be reliable, flexible, and scalable to handle any type of data integration project, including:
· Data warehousing
· Data migration
· Test data management and archiving
· Data integration
· Master data management
· Data synchronization
· B2B data exchange
From a single departmental data warehouse project to a global data migration project, your IT organization may take on many types of data integration projects at once. Your team needs to be able to start with one small project type and reuse the same technologies and assets, by sharing metadata, in subsequent projects.
The data integration platform needs to be able to handle analytical data integration (reporting and analysis) as well as operational data integration (business processes related to operational execution).
Provide data at any latency
Data integration requirements span a wide range of time frames and latencies, depending on the application and use case. Some projects require data to be integrated monthly or weekly; others require integrated data on a per-second basis. IT organizations need the flexibility to change latency requirements without rebuilding the entire infrastructure.
As shown in Figure 5, the ideal data integration platform must provide support across this full range of latencies, delivering trusted data whenever the application or user needs it, whether through real-time, batch, or change data capture (CDC) processing.
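The difference between batch delivery and change data capture can be sketched as follows: the batch path resends every row on each run, while the CDC path sends only rows whose version has advanced since the previous run. The table layout and version counter are assumptions made for the example.

```python
# Hypothetical sketch contrasting full batch delivery with change data capture (CDC).

def batch_delivery(source: dict[str, dict]) -> list[dict]:
    """Send every row on each scheduled run (e.g. nightly or weekly)."""
    return list(source.values())

def cdc_delivery(source: dict[str, dict], last_seen: dict[str, int]) -> list[dict]:
    """Send only rows that are new or changed since the previous run."""
    changed = [row for key, row in source.items()
               if row["version"] > last_seen.get(key, 0)]
    last_seen.update({row["key"]: row["version"] for row in changed})
    return changed

if __name__ == "__main__":
    table = {"A": {"key": "A", "version": 1, "qty": 5},
             "B": {"key": "B", "version": 1, "qty": 7}}
    seen: dict[str, int] = {}
    print(len(cdc_delivery(table, seen)), "rows on first run")      # 2
    table["B"] = {"key": "B", "version": 2, "qty": 9}
    print(len(cdc_delivery(table, seen)), "row after one update")   # 1
    print(len(batch_delivery(table)), "rows every time in batch")   # 2
```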
Unified
A single, unified data integration platform can greatly simplify the work of the IT team. When you have all the data integration capabilities your extended enterprise needs from a single vendor, you maximize productivity through role-based collaboration, shared metadata, and a single, unified runtime engine.
Role-based collaboration
Data integration projects involve IT and business people playing multiple roles. Each role needs a different set of tools designed specifically for it. At the same time, project team members must work together, sharing work and tasks, to improve cross-team productivity and keep IT and the business units aligned.
As shown in Figure 6, the ideal data integration platform provides role-specific tools designed for each person's skills and tasks. These tools share a consistent look and feel and are integrated with one another, so they are easy to learn and use. By reusing assets across different data integration projects, team members can get up to speed quickly and stay productive.
Shared metadata
The data integration platform must provide shared metadata. Every tool within the platform must have access to metadata about where the data is stored and the business rules and logic associated with it. With shared metadata, everyone can work on the same thing together. Analysts and developers can work with different types of metadata or view the same metadata in different ways and still maintain effective collaboration. Metadata is consistent, and everyone can easily see the impact of potential changes.
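A minimal sketch of what shared metadata enables: if one registry records where each data element comes from, which rule governs it, and which downstream assets use it, then anyone can ask what a change would affect. The element names and registry structure below are invented for illustration.

```python
# Hypothetical sketch of a shared metadata registry used for impact analysis.

METADATA = {
    "customer.email": {
        "source":  "crm.contacts.email_addr",
        "rule":    "must match a basic e-mail pattern",
        "used_by": ["marketing_warehouse", "billing_sync", "b2b_exchange"],
    },
    "customer.segment": {
        "source":  "warehouse.dim_customer.segment",
        "rule":    "one of: consumer, business",
        "used_by": ["campaign_reports"],
    },
}

def impact_of_change(element: str) -> list[str]:
    """List every downstream asset affected if this element changes."""
    return METADATA.get(element, {}).get("used_by", [])

if __name__ == "__main__":
    print("changing customer.email affects:", impact_of_change("customer.email"))
```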
Unified runtime engine
The key to a data integration platform is a single runtime engine. The individual products that make up the platform should all run on the same engine, which simplifies implementation, management, and maintenance. A single engine also makes version upgrades easier. The platform must be designed for enterprise-level deployments, with reliable scalability, availability, and security, so you can confidently run your business on it.
Open
An open, neutral data integration platform is designed to be compatible with everything in your current IT environment: your hardware, software, technical standards, and anything you may add in the future. An open platform protects your business from the risk of vendor lock-in.
Access data from any source
Most organizations store data in hundreds of different formats: enterprise applications, databases, flat files, message queues, spreadsheets, and other documents. As shown in Figure 7, the data integration platform must handle any data type or format, including structured and unstructured data from any source and all master data types, such as customer data, product data, and financial data.
The data integration platform must be able to access data that resides outside the enterprise. This includes data from multiple business entities and data distributed across many different geographic locations and countries.
Reduce risk
The IT landscape is changing, and change brings uncertainty. IT organizations need strategies to mitigate the risks of this change. You need a data integration platform that supports all current technical standards, from operating systems to databases. It must be open to ensure compatibility with everything that is configured now or may be configured in the future, including all of the applications and data sources in your business, in the "cloud", and at your partners.
Economical
An economical data integration platform delivers the lowest possible total cost of ownership (TCO) and the fastest, highest return on investment (ROI). In the current tough economic environment, every technology investment is rigorously reviewed to assess how it helps the IT organization and the business, so these factors are particularly important:
· Reduce costs
· Operate more efficiently
· Generate value quickly
Lower total cost of ownership
A data integration platform must provide easy-to-use tools and reliable scalability and performance so that it reduces upfront costs, reduces ongoing maintenance and management costs, and generates value quickly. Enterprises can deploy the platform for a specific data integration project and then extend it to other projects without spending money on additional tools or training. In short, a data integration platform lets your IT organization do more with less.
Faster return on investment
Getting a fast return on your investment in a data integration platform depends on how quickly you can put it into use, which in turn depends on the availability of skilled IT resources.
More than three times as many developers know Informatica as know any other data integration software on the market, so it is easier to find skilled, affordable Informatica resources to help complete your project. Another way to accelerate the return on investment is to create an integration competency center to support more integration options across the enterprise.
Applying the platform
The Informatica platform in practice
Now let's see how the Informatica platform has helped companies in different industries and geographic locations increase productivity, maximize the return on their technology investments, and reduce costs.
T. Rowe Price is a multinational investment management company with more than $334 billion in assets across a wide range of mutual funds. As the financial industry becomes more complex, more competitive, and more tightly regulated, the company needs to manage more data more effectively. To improve customer service, ensure a consistent IT environment, and comply with data governance regulations, T. Rowe Price decided to create an ICC powered by the Informatica platform. The company developed standards, security policies, and publishing methods, then created data management procedures and recruited participants from the business and IT departments. Starting with the data warehouse, the company gradually extended the use of the Informatica platform to other integration projects. In the end, T. Rowe Price achieved these results:
1. Higher employee efficiency. The IT team ran 12 data integration projects simultaneously in the first year; by the fifth year it was handling 60 projects at a time.
2. Maximized technology investment. Accelerating reuse through standardized processes produced cumulative benefits and cost savings of up to twice the ICC team's own costs.
3. Reduced costs. Using the Informatica platform, T. Rowe Price began to realize net benefits beyond cost recovery in the second year and achieved considerable benefits within five years. Most of the savings come from lower costs for new development, ongoing code maintenance, and impact analysis.
Duke Energy merged with a competitor in 2006 to become one of the largest power holding companies in the United States, with more than 4 million customers in the Carolinas, Kentucky, Ohio, and Indiana. With the merger complete, the utility needed to integrate numerous disparate, widely dispersed data sets. It also needed to ensure it had the consistent, accurate, and timely business information necessary to maintain efficient operations.
Duke Energy turned to the Informatica platform to create best practices, cut costs, and accelerate time to market. By eliminating point-to-point interfaces and creating an integrated data management architecture, the company successfully completed the merger and paved the way for future acquisitions.
With the Informatica platform, Duke Energy can:
1. Improved operational efficiency. With a single data integration platform that offloads data management and reporting from the corporate transaction systems, Duke Energy can give managers high-level views of all types of data more quickly. It has also completed more projects: in the first six months after deployment, 31 projects were scoped and 8 implemented.
2. Maximized the benefits of technology investment. Because the Informatica platform is designed to be compatible with a wide variety of source systems, Duke Energy can easily scale to integrate data from future mergers without disrupting business reporting.
3. Reduced costs. Duke Energy will save $1.5 million annually from consolidation, centralization, and lower operating costs. In addition, it expects to save a further $3 million in operating and maintenance expenses over the next two years, and the cost of completing the next merger is expected to be half that of the previous one.
KPN is a $19.5 billion phone, internet, and television service provider in Western Europe with operations in the Netherlands, Germany, and Belgium. The communications company wants to provide quality service to its more than 35 million customers, but all types of customer data are stored in more than 50 separate applications, so sales and service representatives cannot always tell who they are talking to, let alone how to help that customer or offer additional services. To improve customer service and operational efficiency, KPN decided to integrate all customer data across multiple business units with functionally separate systems. As a long-time Informatica customer, KPN decided to extend the Informatica platform to cleanse, synchronize, and load all of its master data into the new CRM solution.
With Informatica, KPN employees now have a single, comprehensive, up-to-date view of every customer relationship. In the end, the company achieved these goals:
1. Higher efficiency. Because accurate, real-time data can be accessed quickly in the call center, customer service representatives spend 10% less time handling each call and can cross-sell and upsell more effectively, increasing productivity by 5% and average revenue per user by 5%.
2. Higher return on technology investment. Since KPN already used the Informatica platform elsewhere in the enterprise, it simply extended the platform to the new project and completed the CRM implementation on time and on budget, realizing value quickly.
3. Reduced costs. Real-time access to detailed customer data enables KPN to reduce customer churn by 10% per year. In addition, improved and automated data quality reduces IT maintenance costs.
Enterprise's goal
Goals for IT organizations transforming into data-driven businesses
The companies that successfully weather an economic downturn and emerge stronger are those that can respond to changing circumstances. When the competitive landscape, markets, and economies change, these companies can act quickly and take full advantage of opportunities.
These companies need data: getting the right data, at the right time, of unquestionable quality. According to Gartner, "The strategic use of information determines an organization's ability to compete and win." These companies depend to a large extent on their IT organizations. IT departments play a key role in helping their companies become data-driven enterprises. A comprehensive, unified, open, and economical data integration platform gives IT departments the freedom to respond. Such a platform provides a solid foundation for more efficient, effective, and affordable data access. It is the lifeline of timely, trusted data flow that allows IT organizations to support their companies through the economic downturn, leaving them stronger, more flexible, and more competitive when the economy improves.
The Informatica platform can help you transform into a data-driven enterprise by enabling your IT organization to:
· Access, discover, cleanse, integrate and deliver trusted data to your expanding business in a timely manner-any data, anytime, anywhere
· Support all roles involved in the data integration process
· Handle all types of data integration and data quality projects
· Remain compatible with all existing systems and processes, and with any added in the future
Validated in thousands of actual deployments, the Informatica platform can indeed help IT organizations reduce costs, increase efficiency, and bring more value to the business.
