
October 15, 2008

Customer Data Migration Between Sources

Filed under: Data Cleansing, Data Migration, Data Quality — Alena Semeshko @ 5:49 am

Moving contacts from one CRM to another? PowerObjects Microsoft CRM blog has tips on how this process should be planned and handled so as not to lose any data somewhere in the middle.

The blog suggests that you first take a look at your current system and determine whether it’s really worth moving your data to a new system. If, after thorough consideration, you still decide to migrate, move on to the next step: define the relationships and determine exactly what information needs to be migrated. Once you have the hang of what needs to be done and have identified the effort and costs associated with migration, you are ready to rehearse your migration.

I’d also add that, as usual when dealing with customer data, migration from one CRM to another is a great chance to improve your data quality by checking its accuracy and consistency before sending it to your new system.
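Even a lightweight automated check before the move catches a lot. Here’s a minimal Python sketch of the kind of pre-migration audit I mean - the field names and records are made up for illustration, not taken from any particular CRM:

```python
import re

# Hypothetical contact records exported from the old CRM.
contacts = [
    {"id": 1, "name": "Ann Lee", "email": "ann.lee@example.com"},
    {"id": 2, "name": "Ann Lee", "email": "ann.lee@example.com"},  # duplicate of 1
    {"id": 3, "name": "Bob Ray", "email": "bob.ray@example"},      # malformed e-mail
]

# Deliberately crude syntax check: one "@", no spaces, a dot in the domain.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def audit(records):
    """Flag malformed e-mails and duplicate (name, email) pairs before migrating."""
    seen, invalid, duplicates = set(), [], []
    for r in records:
        if not EMAIL_RE.match(r["email"]):
            invalid.append(r["id"])
        key = (r["name"].lower(), r["email"].lower())
        if key in seen:
            duplicates.append(r["id"])
        seen.add(key)
    return invalid, duplicates

invalid_ids, duplicate_ids = audit(contacts)
```

Running a check like this against the full export, and fixing what it flags, is far cheaper than chasing bad records after they have landed in the new system.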

July 16, 2008

Simple Solutions to Huge Data Management Problems

Filed under: Data Cleansing, Data Integration, Data Migration, Data Quality — Alena Semeshko @ 2:42 am

Ponemon Institute surveyed 870 IT professionals and found that 23 per cent of respondents admit their data is often left unsecured and inadequately protected.

The problem usually lies in the way unstructured data is spread across an organization’s knowledge management systems, corporate applications (CRM/ERP systems), databases, files, etc., and in the lack of a clear vision of how it should be consolidated. Recent Gartner Group research supports this, finding that as much as 80 percent of actual or potentially mission-critical enterprise information takes the form of unstructured or semi-structured data.

Integration, migration, synchronization, data cleansing… the solutions are all already out there, so why not make use of them?

April 25, 2008

Good Customer Data is a Must-Have

Filed under: Data Cleansing, Data Integration, Data Quality, ETL — Alena Semeshko @ 12:58 am

Making the most of your customer database and relationship management solution is what every company wants. No doubt about that. Nonetheless, a huge number of CRM approaches prove insufficient and inefficient.

Here are the six aspects of CRM deployment that Richard Boardman in his recent article calls essential to get right:

1. Poorly defined requirements
2. The availability of internal staff
3. Sign offs
4. Data. Good systems require good data, and, if the new system is to be populated with existing data, it’s important that the quality of that data is high. Many organisations are surprised at how many data sources they possess and how poor the data quality is. The cleansing of data and reconciliation of different versions of the same record in multiple data sources can be very time consuming. While there are tools that can help, this process tends to be very manual, and is not something that can be fully outsourced as it requires considerable input from the data owners.
5. User acceptance testing
6. User adoption

I still think data is the key element in this. It’s how you approach, structure and work with your data that makes a difference in your company’s progress. I’d break number four into more precise items like:
1. Well-defined data requirements
2. Customer Data Integration & Data Quality (including ETL, data cleansing and everything related to it)
3. Data management, which, among other things, includes following through with your requirements and cleansing procedures rather than adopting a once-in-a-lifetime/lifecycle (whatever you wanna call it) scheme.
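To illustrate the reconciliation work Richard mentions in point four, here’s a rough Python sketch that merges two versions of the same customer record, preferring the most recent non-empty value per field. The sources, fields and merge rule are all hypothetical - real reconciliation usually needs considerable input from the data owners, as he notes:

```python
# Hypothetical rows describing the same customer, pulled from two source systems.
versions = [
    {"source": "erp", "name": "J. Smith",   "phone": "",         "updated": "2008-01-10"},
    {"source": "crm", "name": "John Smith", "phone": "555-0100", "updated": "2008-03-02"},
]

def reconcile(rows):
    """Merge field by field, letting newer non-empty values overwrite older ones."""
    merged = {}
    # ISO dates sort correctly as strings, so oldest row comes first.
    for row in sorted(rows, key=lambda r: r["updated"]):
        for field, value in row.items():
            if field in ("source", "updated"):
                continue  # bookkeeping fields, not customer data
            if value:
                merged[field] = value
    return merged

golden = reconcile(versions)  # the surviving "golden record"
```

A rule this simple breaks down quickly on real data (conflicting non-empty values, unreliable timestamps), which is exactly why the cleansing step tends to be so manual.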

But I agree with Richard: you still need to be “realistic about the demands these projects will place on the organisation and manage expectations accordingly. Too often CRM projects are deemed failures because they failed to meet impossibly demanding and often self-inflicted deadlines. A better review of what’s involved and a more analytical appraisal of the availability of resources to meet those demands will go a long way to ensure project success.”

April 22, 2008

Data Quality At Large

Filed under: Data Cleansing, Data Quality — Alena Semeshko @ 10:35 pm

What’s data quality for you? Correct customer contact information in your CRM? Think again. Data quality is much more than that: product numbers, associated descriptions, part numbers, units of measure, medical procedure codes and patient identification numbers, telephone numbers, email addresses, commodity codes, vendor numbers, vehicle identification numbers - the list goes on.

This article in CXO describes some consequences of poor data quality:

For the CEO, whose ultimate responsibility is to increase customer retention and loyalty, the effects of poor data can have long-term, devastating consequences. For example, the inability to eliminate redundant name and address records results in additional mail-order campaign costs. Recipients of duplicate mailings are also likely to become frustrated and question the firm’s overall operating efficiency. If these redundant mailings each consistently misspell the individual’s name or address, the frustration level is likely to approach alienation or even a legal concern – especially if the recipient had previously made a request to the mailer that they be removed from the vendor’s mailing list or asked to be placed on an industry-wide, do-not-mail list.

Add to this the cost of the catalogs or merchandise delivered to the wrong address and the real magnitude of the problem only just begins to surface. If a single customer is included in a company’s database multiple times, each time with a different value for the customer identifier, the company will be unable to determine the true volume of this customer’s purchases. It could even be placed in the embarrassing situation of attempting to sell the customer an item that he or she has already purchased. Poor data quality can negatively influence how a company is perceived in the marketplace and damage brand equity.

These data inefficiencies can also result in missed up-sell and cross-sell opportunities. Without a single view of the customer across the enterprise, it’s impossible to aggregate information to make decisions. This makes it impossible to distinguish between single-product and multi-product buyers, or between new and existing customers.

For the CFO – who is in charge of regulatory compliance, managing security risk and other methods of limiting exposure – poor data can result in the company facing public embarrassment, loss of credibility, significant fines and even lawsuits. A forward-thinking organization should include data quality as a part of its everyday operations. While this may not happen overnight, recent regulatory and Homeland Security initiatives such as the U.S. Department of Treasury’s Office of Foreign Assets Control (OFAC), Sarbanes-Oxley, the U.S. Patriot Act, and the Health Insurance Portability and Accountability Act (HIPAA) can quickly spur a company to establish a solid data foundation.


For the CIO, who spends his days striving to achieve peak operational efficiency, inferior data quality can lead to missed opportunities to negotiate better rates with suppliers. Large companies can have thousands, or even millions, of suppliers. Unless you have precise data on how much total business you are conducting with a single vendor across all divisions, you are likely to pay too much for their service.
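The duplicate-record problem the article keeps returning to - one customer entered several times under slightly different names, so true purchase volume can’t be determined - is easy to demonstrate. A crude Python sketch with made-up data (real deduplication tools use far more sophisticated matching than this):

```python
purchases = [
    {"customer": "ACME Corp.", "address": "1 Main St",  "amount": 100.0},
    {"customer": "Acme Corp",  "address": "1 Main St.", "amount": 250.0},  # same firm
    {"customer": "Widget Inc", "address": "9 Oak Ave",  "amount": 75.0},
]

def normalize(text):
    """Crude normalization: lowercase, drop punctuation, so near-duplicates collide."""
    return "".join(ch for ch in text.lower() if ch.isalnum() or ch == " ").strip()

def volume_by_customer(rows):
    """Total purchases per (normalized name, normalized address) key."""
    totals = {}
    for row in rows:
        key = (normalize(row["customer"]), normalize(row["address"]))
        totals[key] = totals.get(key, 0.0) + row["amount"]
    return totals

totals = volume_by_customer(purchases)
```

Without the normalization step, “ACME Corp.” and “Acme Corp” would count as two customers and the company would understate its largest account’s volume - exactly the situation the article describes.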

So what do you do to improve? The article suggests the following:

First,  conduct a Data Quality Assessment to help you recognize the severity of data quality issues.

Second,  adopt a well-defined Data Governance Plan across your organization. That is, define who owns the data, who is authorized to access the data, and which specific standards should apply to the data.

Third, choose a technology to serve as the backbone for the intelligent use and preparation of relevant customer data.

Sounds short and sweet, but try following it through. Will take a while, but you won’t regret it.
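As a taste of what the first step - the Data Quality Assessment - might measure, here’s a tiny Python sketch that profiles field completeness over a handful of made-up records. Real assessments also cover validity, consistency and timeliness; this only shows the shape of the exercise:

```python
# Invented customer records; empty strings stand for missing values.
records = [
    {"email": "a@example.com", "phone": "555-0101"},
    {"email": "",              "phone": "555-0102"},
    {"email": "b@example.com", "phone": ""},
    {"email": "not-an-email",  "phone": "555-0104"},
]

def completeness(rows, field):
    """Share of records in which the given field is non-empty."""
    filled = sum(1 for r in rows if r[field])
    return filled / len(rows)

email_completeness = completeness(records, "email")
phone_completeness = completeness(records, "phone")
```

Even a one-number-per-field profile like this makes the severity of the problem concrete, which is the whole point of the assessment step.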

April 6, 2008

New CEO at StrikeIron

Filed under: Data Cleansing, Data Quality — Alena Semeshko @ 9:59 pm

News from here, emphasis mine.

RESEARCH TRIANGLE PARK, N.C.–(BUSINESS WIRE)–StrikeIron, Inc., the leader in providing innovative solutions for delivering data over the Internet, today announced that David Linthicum will take the helm as company CEO. Linthicum will be responsible for continuing to drive the company’s leadership position as the frontrunner in delivering critical Web services and data, on demand, for the emerging next-generation Internet. Bob Brauer remains as president and co-founder and will continue to lead the day-to-day operations of StrikeIron.

“StrikeIron’s revenue more than doubled from Q107 to Q108 and has tremendous momentum in the industry. We’re moving beyond simply delivering data as a service and into a new era of growth and development for new innovative products,” stated Brauer. “As an industry thought leader and visionary, Dave’s addition to the StrikeIron team helps us take the appropriate steps to deliver on the promise of Service Oriented Architecture via the Web and building the foundation for Web 2.0 applications with our managed Web services platform. We are confident that under Dave’s leadership, StrikeIron is well-positioned to go to the next level.”

“The emerging Web is an exciting medium that has come of age. Web services and mashups are changing how we access and deliver information and StrikeIron has established themselves as one of the driving forces in the industry,” stated Linthicum. “I look forward to building on the success StrikeIron has already achieved to date.”

A quick reminder - as a result of a recent partnership agreement with StrikeIron, Apatar has released two connectors to StrikeIron’s data quality services: the StrikeIron US Address Verification connector and the StrikeIron E-mail Verification connector. These data quality services from Apatar and StrikeIron ensure the validity of your data, increase productivity, improve sales strategies, and take customer service to a new level by providing faster transaction processing and higher accuracy.

March 26, 2008

Data Quality Ups and Downs

Filed under: Data Cleansing, Data Quality — Tags: — Alena Semeshko @ 3:53 am

Everyone seems to be discussing a recent QAS data quality survey entitled ‘Contact Data: Neglected asset seeks responsible owner’ that questioned over 2,000 organizations worldwide and revealed an increasing number of businesses taking data quality issues seriously and bringing them up to the boardroom level.

“Within the past three years, the number of businesses where the responsibility of data integrity has risen to boardroom level has soared by 16 per cent, showing how important an issue accurate data has now become.”

The survey also stated that:

* the number of employees directly involved in data quality management has increased by 5% in the last year alone
* 23% of the businesses that participated in the survey claimed to use strategic data planning applications on a daily basis
* 46% have their own documented data quality strategy

These increasing numbers sure are encouraging, and if the growth persists, or even speeds up a bit, we might see conceptually new, better, cleaner data emerge as an accepted standard of data quality. Now that would be nice, wouldn’t it?

However, with the survey showing 34% of respondents not validating any of their customer and prospect data, there’s still a long way to go to reach the “standard” I’m talking about.

QAS group operating officer Jonathan Hulford-Funnell says: “I find it incredible that organisations are not paying more attention to data quality. It shouldn’t be seen as a burden for middle management, it should be something that every employee in the business takes responsibility for.”

March 21, 2008

Stop Blaming IT for Dirty Data

Filed under: Data Cleansing, Data Quality — Tags: , — Alena Semeshko @ 4:19 am

IT is the easiest to blame for drawbacks and holes in your data, that’s no news. Whenever you don’t get the results and the information you need (provided your business processes are set to present you with quality data), you naturally start looking for someone or something to blame. And IT seems to be the perfect scapegoat. Little do we realize that the problems lie in the business, not in IT.

The thing is, we associate data with IT, consider it a part of IT, and don’t realize the two are totally different. Gartner research VP Ted Friedman suggests a solution that should keep the blame off IT and cause fewer data quality problems:

“Business needs to be in the driver’s seat,” Friedman said. “At the moment we feel that the focus on the topic is way way too much in the IT camp.”

To advance data quality, Friedman suggests the use of a data steward, who is responsible for benchmarking current levels of data quality and measuring the impact on the business of bad data. The data steward looks at the data transfer processes, making sure, for instance, that the data passes through as few people as possible.

Data stewards will come from a business background but have good relations with IT, Friedman said. They will only be effective if they are held accountable for their progress and receive bonuses for meeting quality targets.

March 20, 2008

5 Things to Watch Out for in Data Warehousing

Filed under: Data Cleansing, Data Integration, Data Quality, Data Warehousing — Tags: — Alena Semeshko @ 7:45 am

There’s been talk of the concept of data warehousing being misleading, failing to deliver efficient solutions at the enterprise level, and frequently causing problems upon implementation. Problems like that, again, don’t come out of nowhere; there usually are good reasons behind them. In this post I’ll try to sum up a few things you should definitely watch out for when tackling your data warehouses:

1) First and foremost – Data Quality. When your data is dirty, outdated and/or inconsistent upon entering the warehouse, the results you’re gonna get won’t be any better, really. Data warehousing is not supposed to deal with your erroneous data; it’s not supposed to perform data cleansing. These processes need to take place BEFORE your data gets even close to the warehouse. That is, your data integration strategy needs to address the low-quality data problem.

2) Come to think of it, Data Integration is the second thing to watch out for. Do your integration tools live up to your requirements? Can your software handle the data volumes you have? Will it cope with source systems and subject areas newly added to your warehouse? How high is the level of automation of your integration system? Can you avoid manual intervention? You gotta ask yourself all of these questions before you complain that your warehouse isn’t providing you with the quality of information you expected.

3) Next, dreaming too big. When you build sand castles, you gotta realize they’ll disappear in a matter of days, even hours. You can’t have it all at the same time; you can’t have your cake and eat it too. Breaking the project into small segments, giving them enough time to deliver, and having patience is the key to a pleasant experience with your data warehousing solution. What? Did you think you could fix all the mess in your data in a matter of days? =)

4) Then, don’t go rushing into solutions. Don’t panic. Yes, warehouse projects require time and effort on your part. Yes, it’s gonna be complicated at first. But that’s not the reason to stop with one project and rush into another. Stick with your first choice, fix it, work on it. Multiple projects will waste your resources and end up as another silo aimlessly taking up your corporate resources.

5) Finally, make sure you have a scalable architecture that you can redesign according to your increasing needs. Your business grows, sometimes grows quicker than you think (the number of customers increases, they have more information, more data to be processed) and you want your solution to continue to perform on the same level and live up to your expectations.

The list goes on actually, as there are more things to watch out for… but these are the first that come to mind. =)
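Point 1 above - keeping dirty data out of the warehouse - can be enforced with a validation gate in the integration layer. A minimal Python sketch, with invented field names and rules, just to show the idea of rejecting rows before they ever reach the warehouse:

```python
import datetime

def valid(row):
    """Reject rows that would pollute the warehouse: missing keys, impossible dates."""
    if not row.get("customer_id"):
        return False
    try:
        datetime.date.fromisoformat(row["order_date"])
    except (KeyError, ValueError):
        return False
    return True

# Hypothetical staging rows waiting to be loaded.
staged = [
    {"customer_id": "C1", "order_date": "2008-03-01"},
    {"customer_id": "",   "order_date": "2008-03-02"},  # missing business key
    {"customer_id": "C3", "order_date": "2008-13-99"},  # impossible date
]

clean, rejected = [], []
for row in staged:
    (clean if valid(row) else rejected).append(row)
# Only `clean` proceeds to the load step; `rejected` goes back for cleansing.
```

Routing rejects back to the source owners, rather than silently dropping them, is what keeps the warehouse from becoming the place where data problems are discovered too late.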

March 19, 2008

BA or BI 2.0?

Filed under: Data Cleansing, Data Quality — Alena Semeshko @ 7:14 am

Questions have come up lately on whether the term Business Intelligence has outlived its practical side and whether Business Analytics is a more appropriate term. Opinions on the relationship between BI and BA are split. Some say Business Analytics makes up just one part of, or a niche within, the large concept of Business Intelligence. Others consider BI too vague a term and feel more comfortable with Business Analytics as a definition for the new sophisticated data quality, data integration and ETL tools.

The proponents of the second “theory” say Business Intelligence is changing the way businesses work and think. BI here implies not only moving data around and producing reporting services, but also keeping pace with constantly changing and dynamic business requirements.

So, what is analytics? Neil Raden of Hired Brains, a market research and management consulting firm, has said that, “the proper term for interacting with information at the speed of business, analyzing and discovering and following through with the appropriate action, is ‘analytics’.”

Well, opinions may differ, but regardless of what you call it, be it Business Intelligence or Business Analytics, the data quality services Apatar provides play an integral role in interacting with information - merging, transferring and validating it - everything that both BI and BA are all about.

Take a look at how Apatar’s data quality services can validate and improve your customer data and get it clean and easy to work with. Or browse the data quality web demo over here.

March 12, 2008

Data Migration Talk

Filed under: Data Cleansing, Data Migration, Data Quality — Tags: , — Alena Semeshko @ 4:37 am

Clean up! You first hear these words as a kid from your parents. Clean up! When you hear this, you usually know you’ve made a mess. Clean up! This is what you shouldn’t be hearing, or, for that matter, thinking, in regards to your company’s data. Or, at least, if the prospect ever crosses your mind, it shouldn’t look as nasty and unpleasant as it used to in your childhood. =)

But nonetheless, clean up you should. If your source systems and initial data are a mess, of course. The obsession with clean data is justified in this world of Business Intelligence, where looking at the picture as a whole and thinking big is no longer an encouraged yet infrequent occurrence, but a requirement.

One of the key elements to having your data clean and having a global view of your organization’s lifecycle is data migration. Wise data migration, with an appropriate strategy and the right tools - not the sort where you splash money around and remain in the same spot where you started.

Anyway, a whitepaper I came across got me thinking about this, so you can download it and check it out for yourself over here. It’s called The Hidden Costs of Data Migration and it touches upon the issue of data migration, whether to employ it or not, and the costs associated with it.

Data migration has become a routine task in IT departments. However, with the need for critical systems to be available 24/7 this has become both increasingly important and difficult. This White Paper will outline the factors that are driving data migration and examine the hidden costs that may be encountered when data is moved.
