SAS White Paper


  • 8/2/2019 SAS White Paper

    1/15

Bolstering business intelligence with a sound data integration plan

Business intelligence (BI) systems must have a solid data management infrastructure

    to be effective, trusted and adopted by organizations. Since most BI initiatives require

    information from multiple source systems, data integration is a critical consideration.

    BI professionals designing data integration processes must address difficult questions:

    How will data from multiple sources be reconciled and loaded into the data warehouse

    and/or BI system? How often must this occur? Does the system require audit trails,

    so data lineage can be traced back to the originating source systems? The issues are

    challenging, but not insurmountable. In this E-Book, learn more about choosing the

    right data integration style for your BI initiative. Find out specific data integration

    considerations for BI programs and learn about new tools and methods that may help,

    such as real-time data integration. Read more about the pros and cons of hand-coding

for data integration and get a top analyst firm's take on the leading vendors and

    trends in the data integration market.

    Sponsored By:

    E-Book

TechTarget Enterprise Applications Media


    Bolstering business intelligence with a sound data integration plan

Table of Contents:

Sponsored by: Page 2 of 15

    The real deal on data integration for business intelligence

    The three paths to real-time data integration

    Gartner data integration Magic Quadrant 2007: Platforms, market expand

    Data integration software vs. hand coding: Balancing costs and benefits

    Resources from SAS



    The real deal on data integration for business intelligence

    By Jeff Kelly, SearchDataManagement.com News Editor

    There are various ways to sabotage a business intelligence (BI) initiative, but perhaps none is more effective than

    ignoring key data-integration considerations at the outset.

Data integration (connecting data sources with data consumers) lies at the very foundation of BI, according to

    experts. But successful data integration is no easy task. Poorly labeled data fields and siloed data sources, among

    other obstacles, see to that. BI done right requires carefully selected data sources, particular attention to data

    quality, and an understanding of the ultimate business uses of data.

"Business intelligence is all about data integration," said William McKnight, senior vice president of information management at East Hanover, N.J.-based consulting firm Conversion Services International. "You can't have effective BI without good data integration, or else you'll be running off different sets of books and encountering various semantic and other cultural, and very real, [obstacles] to success."

    Ensuring a successful BI initiative requires attention to some key data integration considerations, according to

    McKnight and Rick Sherman, founder of Stow, Mass.-based business intelligence consulting firm Athena IT Solutions.

    Get a grip on data granularities

BI users often want to drill down deep into data to gain better insights on the business. "The first step in any BI data integration project is to understand the granularities of an organization's data in order to agree upon definitions," McKnight said. That way, when users delve into the data, they are all working off a common framework.

In a retail company, for instance, what exactly defines a customer? Is it a single person or the entire household? If another member of that household makes a purchase, is he or she a new customer or part of the existing household customer account? Questions like these must be addressed before integrating data into a BI system, McKnight said. How they are answered will largely depend on a company's business model and corporate culture.

    Not all data sources are created equal

The next step is to take an inventory of data sources: databases, data warehouses, spreadsheets, etc. It is important to realize that different departments often have their own internal databases and spreadsheets that contain overlapping information, which can lead to problems with data quality.

"A lot of companies have grown up in silos and now are looking to integrate data to get a holistic view of the business," McKnight said. "That means there are different cuts of the data taken off in different departments, sometimes with their own IT departments or their own rogue IT departments." Before data can be integrated, he said, these various data sources must be reconciled.



Athena's Sherman also stressed the importance of data quality in any integration initiative. Prior to the implementation of a BI system, he said, users in separate departments probably routinely make changes to data to suit their purposes, changes that are not relayed to other business units. That's fine when the data never leaves a given department, but can wreak havoc on BI systems that access enterprise-wide data.

"They massage the data to make up for data quality issues, or gaps in the data, or things they don't think are right in the data," Sherman said. "Once you [undertake] a BI initiative, then you want to look at more detailed data, from across different systems. That's when you start to see data quality issues."

    Foster buy-in from BI stakeholders

    In conjunction with addressing data quality issues, executives and analysts should also be consulted to determine

    how they plan to use the resulting BI reports, analytics and dashboards.

"You need to pick data sources appropriately for the stakeholders of BI," McKnight said. "And they should understand the choices that they have to pick from within the organization."

Understanding how data will be used is essential to successful BI data integration, Sherman agreed, but understanding data lifecycles is equally important.

"The lifecycle of your data is very closely related to your business processes," Sherman said. Data used by a

    manufacturing company, for instance, undergoes very different processes throughout its lifetime than data used

    by a retailer. Even within industries, data lifecycles often vary. Both Sherman and McKnight said knowing when to

    integrate data for BI purposes during its lifecycle is a key success factor.

Only after data source options are clear, data quality issues have been addressed, and BI business uses determined can an organization actually begin integrating data with BI systems, McKnight and Sherman agreed.

"The reason why some BI projects take so long, sometimes nine months to a year or longer, is due to getting business users to determine how they're going to define and use the data, plus assuring data quality and writing data governance rules," Sherman said. "Business intelligence really raises the bar for data integration."


Gophers burrow through life without seeing the havoc they create. They can't help having tunnel vision. But you can. With proven business intelligence and analytic software from SAS.

www.sas.com/gophers

SAS and all other SAS Institute Inc. product or service names are registered trademarks or trademarks of SAS Institute Inc. in the USA and other countries. ® indicates USA registration. Other brand and product names are trademarks of their respective companies. © 2008 SAS Institute Inc. All rights reserved. 00000US


    The three paths to real-time data integration

    By Jeff Kelly, SearchDataManagement.com News Editor

Companies are increasingly turning from traditional batch-oriented techniques to real-time data integration to eliminate the scourge of out-of-date data. Real-time data integration can be achieved through a variety of methods, but

    the goal is the same: to communicate accurate, timely data from point A to point B in real time so users can make

    better-informed business-critical decisions. Experts agree that real-time data integration is gaining popularity but

    also warn that it is not a methodology to adopt lightly.

"Recognize that the world is not a black-and-white place," said Ted Friedman, vice president and distinguished analyst with Stamford, Conn.-based Gartner Inc. "Any given company is going to have data integration requirements that span the latency spectrum. There are going to be pieces that are best suited to be delivered in a high-latency, batch-oriented mode, and there [are] going to be other things where real-time data integration really does have value."

    Real-time data integration options

    The most common real-time data integration method is change data capture (CDC), also called data replication.

    CDC tools and technologies recognize when an important change has occurred in one data source and, in real time,

    transmit the change to a given target.

Bloor Research's Philip Howard explains: "As a change is made to a database record in your transactional system, for instance, it's also actively captured and fed through to your data warehouse or business intelligence system, or whatever you've got running, so it's ready to answer real-time queries."
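The idea Howard describes can be sketched in a few lines. This is a hypothetical, in-memory illustration only: the table, row versions and polling loop are invented for the example, and real CDC products capture changes from database transaction logs rather than from application data.

```python
# Minimal change-data-capture sketch (hypothetical, in-memory).
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class Table:
    rows: Dict[int, dict] = field(default_factory=dict)
    version: Dict[int, int] = field(default_factory=dict)  # per-row change counter

    def upsert(self, key: int, row: dict) -> None:
        self.rows[key] = row
        self.version[key] = self.version.get(key, 0) + 1

def capture_changes(source: Table, last_seen: Dict[int, int]) -> Dict[int, dict]:
    """Return only the rows whose version advanced since the last sync."""
    changed = {k: source.rows[k] for k, v in source.version.items()
               if v > last_seen.get(k, 0)}
    last_seen.update(source.version)
    return changed

def apply_changes(target: Table, changes: Dict[int, dict]) -> None:
    for key, row in changes.items():
        target.upsert(key, row)

# Transactional system on one side, warehouse on the other.
oltp, warehouse, seen = Table(), Table(), {}
oltp.upsert(1, {"customer": "Acme", "balance": 100})
apply_changes(warehouse, capture_changes(oltp, seen))
oltp.upsert(1, {"customer": "Acme", "balance": 250})   # a change occurs...
apply_changes(warehouse, capture_changes(oltp, seen))  # ...and is propagated
print(warehouse.rows[1]["balance"])  # 250
```

Only the changed rows cross the wire on each sync, which is what keeps the target current without re-copying the whole source.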

CDC is used most often to synchronize operational applications and for real-time business intelligence (BI) purposes, according to Gartner's Friedman. Indeed, business intelligence is a major driver of real-time data integration adoption, he said, especially among businesses that require BI reports at a moment's notice.

"If you've got some type of short-cycle business and you need up-to-the-second analysis of how your supply chain is performing, [for example]," Friedman said, "then you need to be delivering data from some data sources to your BI application in more of a real-time fashion."

    CDC is less ideal, however, if the goal is a comprehensive, real-time view of a single entity via data housed in multiple

    sources. For that, users more often turn to data federation, sometimes called enterprise information integration.

"Data federation is better suited to people that are looking at a more narrow slice of the data landscape," Friedman said. "They want to get a complete view of a single instance of an entity (a customer, a product, an employee) as opposed to somebody who's doing historical trending in the data warehouse."

For example, an insurance agent on a customer call might use an application supported by data federation technology to search multiple data sources to obtain a comprehensive view of that customer while still on the call. "That [data] needs to be in real time," Friedman said.
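The insurance scenario above can be sketched as a federated query that assembles one entity's records from several live sources at request time, instead of copying them into a warehouse first. The source names and fields here are illustrative, not any real system's schema.

```python
# Hypothetical data-federation sketch: three independent systems, queried live.
policies = {"C42": {"policy": "AUTO-7", "premium": 850}}
claims   = {"C42": [{"claim_id": 901, "status": "open"}]}
billing  = {"C42": {"last_payment": "2008-03-01", "balance_due": 0}}

def federated_customer_view(customer_id: str) -> dict:
    """Assemble a single view of one customer across independent sources."""
    view = {"customer_id": customer_id}
    view.update(policies.get(customer_id, {}))   # policy system
    view["claims"] = claims.get(customer_id, []) # claims system
    view.update(billing.get(customer_id, {}))    # billing system
    return view

# An agent on a call pulls the complete picture of customer C42 on demand.
print(federated_customer_view("C42")["policy"])  # AUTO-7
```

Nothing is replicated ahead of time; the merged view exists only for the duration of the request, which is why federation suits a narrow, single-entity slice rather than historical trending.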


Both the CDC and data federation markets are well established, Howard said, having already gone through "the consolidation phase that you tend to get once products start to mature." Large vendors like IBM (which acquired data integration specialist DataMirror last year) and Oracle (which scooped up Sunopsis in 2006), as well as smaller players like Teradata and GoldenGate Software, offer a variety of solid CDC and data federation real-time data integration tools, he said.

    Friedman also identified a third approach, what he calls the messaging-middleware method, in which real-time data

    integration is achieved through middleware technologies that connect applications.

"Think of IBM WebSphere MQ and Microsoft BizTalk Server, and products like that, that are really meant to do granular, message-oriented propagation of data," Friedman said. "An application on one end spits out a message of something meaningful that happened, and these technologies propagate that message to another system or application in a low-latency fashion. So it's sort of like the data replication idea, but working at the application layer as opposed to the database layer."

This middleware approach is ideal for inter-enterprise scenarios, when there's a need for real-time data integration among organizations that may not have access to one another's data sources, Friedman said. A vendor might communicate an important data change to a supplier in real time using this method, for instance.
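A bare-bones sketch of the messaging-middleware style Friedman describes: one application publishes an event about "something meaningful that happened," and a subscriber on the other side applies it to its own store. The queue here merely stands in for the middleware; real products such as WebSphere MQ or BizTalk add durability, routing and transactions, and the event fields are invented for the example.

```python
# Message-oriented propagation sketch: application layer, not database layer.
import json
import queue

broker: "queue.Queue[str]" = queue.Queue()  # stand-in for the middleware

def publish(event: dict) -> None:
    broker.put(json.dumps(event))  # events cross app boundaries as text

def consume(store: dict) -> None:
    while not broker.empty():
        event = json.loads(broker.get())
        store[event["order_id"]] = event["status"]  # application-level apply

supplier_view: dict = {}
publish({"order_id": "PO-1001", "status": "shipped"})  # vendor's app emits
consume(supplier_view)                                 # supplier's app reacts
print(supplier_view["PO-1001"])  # shipped
```

Because the exchange happens at the application layer, neither party needs access to the other's database, which is exactly what makes this style fit the inter-enterprise case.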

    Real-time data integration raises data quality concerns

Both Howard and Friedman noted, however, that while there are many benefits to real-time data integration, there are numerous drawbacks as well; first among them, poor data quality. In more traditional, batch-oriented data

    integration processes, there is ample time to scrub and cleanse data before it reaches its destination. Not so with

    real-time data integration, regardless of the method.

"In the middle of that process [batch-oriented data integration], you've got a chance to actually analyze and cleanse that data," Friedman said. "In the world of real-time data integration, there's less opportunity to apply very sophisticated files for analyzing the quality and cleansing the data." There is a higher risk, then, that data integrated in real time will be of poorer quality, incorrect or misleading.

Friedman said current real-time data integration tools are better at data transformation and cleansing than they've

    been in the past, but there is still plenty of room for improvement. It is possible that someday near-perfect real-

    time data integration quality could be achieved, he said, as the problem is more technological than conceptual.

Both analysts said it is also important to recognize that real-time data integration isn't ideal for all companies and organizations, and in some cases may even prove detrimental. Friedman advises users to match their data integration method to their latency requirements. An organization that routinely analyzes certain data sets on a weekly basis, for example, would in that case have no need for real-time data integration, which could actually cause more harm than good, partly because of the already mentioned data quality concerns.


Organizational structure and corporate politics also play a role in determining the appropriateness of real-time data integration, Friedman said. If users aren't ready to accept and use real-time data, there's little point in integrating data in real time in the first place.

"Frankly, I know some companies that if they had real-time BI it wouldn't matter at all, because the way they make decisions, the culture and the politics of the organization are not set up for them to act on real-time information," Friedman said. "I think that's a limiting factor for many organizations today."

Howard agreed, pointing to what he called "decision-making latency."

"How soon can you as a human being make a decision based on new information that you're given? If you have to have a meeting with five other people and it takes two days to arrange that, or even two hours to arrange that, then you don't need real-time [data integration]," Howard said.

He added: "If you can make a decision instantly ('Ah, this has happened, therefore I know to do such-and-such'), then that's where real-time decision making becomes important."


Chickens are hypnotized by drawing a line in the dirt over and over. They find it hard to step past the obvious. But you can. With proven business intelligence and analytic software from SAS.

www.sas.com/chickens

SAS and all other SAS Institute Inc. product or service names are registered trademarks or trademarks of SAS Institute Inc. in the USA and other countries. ® indicates USA registration. Other brand and product names are trademarks of their respective companies. © 2007 SAS Institute Inc. All rights reserved. 00000US


    Gartner data integration Magic Quadrant 2007:

    Platforms, market expand

    By Hannah Smalltree, SearchDataManagement.com Senior Site Editor

Data integration buyers now have many choices of vendors and platforms, but they're increasingly interested in one method in particular, according to a co-author of Gartner Inc.'s Magic Quadrant.

The market continues to experience a dichotomy, with a changing cast of vendors delivering data integration platforms, according to Ted Friedman, vice president and distinguished analyst with the Stamford, Conn.-based analyst firm and co-author of the study. Big vendors are getting bigger through acquisition, as evidenced by IBM's purchase of Cognos and SAP's acquisition of Business Objects. But other, smaller, pure-play vendors are sprouting up, too.

There are additions to the study this year: Woodcliff Lake, N.J.-based Syncsort Inc. and Hummingbird Connectivity, a division of Waterloo, Ontario-based Open Text Corp., as well as a host of other vendors that didn't meet the inclusion criteria but still added to the market's growth.

Buyers are demanding data integration platforms that support a range of different styles of data delivery, a trend noted in last year's study and mirrored in this year's inclusion criteria. The study looked only at platforms that support ETL and at least one other integration style, like real-time data replication or data federation, Friedman said. Platforms are converging at a deeper technology level as well. That means they're based on common design tooling, metadata and runtime architectures, according to the study.

The data integration method garnering more interest now is real-time data integration, particularly change data capture (CDC), Friedman said. CDC tools help companies recognize and capture important changes that are occurring in one data source and then propagate that changed data to another target. IBM talked up that feature when it acquired DataMirror, and Friedman has often been asked about it recently.

"The terminology that's emerging is event-driven approaches to data integration," Friedman said. "It's growing fairly substantially in popularity. Real-time, operational business intelligence is pushing some of that interest, but more than that, people are interested in the technology because it enables a more time-effective, low-latency way of synchronizing data across different systems and databases."

There's increased interest in data federation, too, though Friedman said inquiries about it are a distant third to questions about ETL or event-driven approaches. And buyers are also asking about open-source data integration, which in the longer term could affect the market in a fairly dramatic way, he said. Though he's not seeing substantial use of open source data integration tools on an enterprise-level scale, he said, if the freely downloaded tools mature to offer more functionality, commercial vendors may begin to feel some pricing pressure.


    Enterprise data integration tool vendor rankings 2007: Leaders lead, others

    jockey for position

The leaders quadrant of the study, for vendors with high marks for vision and execution, hosted IBM and Informatica. Both were in positions similar to those of last year, thanks to continuing to meet market demands with versatile platforms, Friedman said.

    Business Objects, Microsoft and Oracle all appeared in the challengers quadrant, for vendors with less vision but

    strong customer execution. The latter two have vision challenges, Friedman said. While their position as challengers

    is largely a result of increased customer adoption of SQL Server Integration Services and Oracle Data Integrator, he

    said, these vendors need to do more to articulate a comprehensive vision.

The visionaries quadrant, for vendors with good vision but not as much execution experience, had just two vendors this year. iWay, a subsidiary of business intelligence vendor Information Builders, made it into this section thanks to its breadth of integration capabilities and vision, particularly around the role of data integration in service-oriented architectures, Friedman said. SAS Institute was also in the visionaries quadrant with a strong, versatile platform.

    Most vendors were clustered in the niche quadrant, reserved for those not as strong on vision and execution but which

still met the study's inclusion criteria. Near the top of the section was Pervasive Software, which dropped back a bit from its position in the challengers section last year. Pervasive has a strong product with a mix of capabilities, and the cost-to-functionality ratio is good, Friedman said, but the vendor didn't have enough enterprise-class implementations

    or execution experience to be in the challengers quadrant this year. Others in the niche quadrant were Cognos,

    Sybase, Tibco Software, SAP, ETI, Sun, Pitney Bowes Group 1 Software, Syncsort, and Hummingbird Connectivity.

Friedman is interested to see how SAP's acquisition of Business Objects affects the space, since both vendors have integration products and little is known yet about how they may rationalize these capabilities.

    Advice for enterprise data integration platform purchasers

Buyers should definitely look at data integration platforms with breadth, or many integration capabilities, Friedman said, and should think beyond ETL for a data warehouse.

"You can get a lot of value out of these tools for a range of different scenarios," he said, "not just ETL for the warehouse, but consolidations, conversions, application integration and more."

Organizations should also look for strong metadata management and modeling functions, Friedman said. Basic integration capabilities may become commoditized in the long term, he said, so organizations should align with vendors that can help them truly understand data assets, relationships between data, and how data changes over time.

Finally, buyers should seriously think about the importance of data and data integration in service-oriented architectures (SOA), Friedman said.

"Consider how these types of tools can be used to help facilitate and add value in your SOA work," he said. "We find that's a concept that too few organizations are thinking about today."


Need to turn your information stampede into a logical path for success? You can.

Find your direction with assets like this using the SAS Platform Pathfinder.

www.sas.com/pathfinder1


    Data integration software vs. hand coding:

    Balancing costs and benefits

    By Hannah Smalltree, SearchDataManagement.com Senior Site Editor

    As organizations consider investing in data integration software, they encounter a familiar question: When and why

    does it make sense to buy technology to do something that you could do manually?

Organizations have long relied on in-house developers to hand-code scripts for data integration projects. Yet, for almost 20 years, they've had alternatives in the form of extract, transform and load (ETL) tools and data integration platforms. But many companies still use hand coding for data integration projects, including, until recently, Canadian retailer Home Hardware Stores Ltd.

Two years ago, when Home Hardware was upgrading its merchandising system, a consultant encouraged the company to consider data integration software, according to Will Buddell, senior systems analyst with the St. Jacobs, Ont.-based retailer. The hardware, building materials and furniture retailer had to merge its product database with an acquired company's product database. With about 100,000 different products, combining two Oracle databases was no small integration task.

    Though Home Hardware had the in-house skills necessary to manually hand-code scripts, data integration software

    would provide productivity gains, according to the consultant. The team zeroed in on the Data Integrator software

    from Austin, Texas-based Pervasive Software Inc. But first, Home Hardware ran a comparison test.

"We had an internal development team that produced a test XML file. Time on the books for them to produce it was about 20 hours," Buddell said. "With [Data Integrator], we produced the same XML file in about four hours. We used that as the basis to go forward, because it showed that we could reduce our development time drastically."

Hand coding wouldn't have been difficult, Buddell said, just time-consuming. In contrast, it took only a day or so to set up the Data Integrator software. The pre-built database connectors and a drag-and-drop interface for source-to-target mapping made it especially user-friendly, he said. When it was actually time to merge the data for phase one

    of the project, it only took about two hours. Integration for other phases of the project sometimes took longer, but

    not as long as hand coding would have taken, Buddell said. Plus, time is money. Home Hardware estimates that the

    software helped it save anywhere from $200,000 to $500,000 (Canadian dollars) during the project, he said.

    Why use data integration software?

    Productivity gains are a big reason that many companies use data integration software, according to Rick Sherman,

    founder of Stow, Mass.-based consulting firm Athena IT Solutions. In addition to saving time, many common,

repeatable functions are coded into the software, so that companies don't have to reinvent the wheel every time they do an integration project. That's not all.


"An ETL tool or data integration platform means that you have all of the code in one place, documented and managed for version control," Sherman said. "With hand coding, you have code all over the place, little management and little documentation."

That's a problem when companies need to replicate integration procedures, troubleshoot or change code, he said. And then there are issues like auditability and transparency, since regulations like SOX or HIPAA demand that companies be able to trace data back to their original source systems. Data integration software helps by automatically documenting information about data sources and transformations, making for an easier audit trail.
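The audit-trail idea above can be illustrated with a small sketch. This is not any vendor's API: the load function, source names and transformation string are invented for the example, and in a real platform the lineage metadata store is built in rather than a list.

```python
# Illustrative lineage sketch: each load records where a value came from
# and what transformation produced it, so a reported figure can later be
# traced back to its originating source system.
from datetime import datetime, timezone

lineage_log = []  # stand-in for a platform's metadata repository

def load(target: dict, key: str, value, source: str, transform: str) -> None:
    target[key] = value
    lineage_log.append({
        "key": key, "source": source, "transform": transform,
        "loaded_at": datetime.now(timezone.utc).isoformat(),
    })

warehouse: dict = {}
load(warehouse, "q1_revenue", 1_250_000, source="erp.gl_entries",
     transform="sum(amount) where quarter = 'Q1'")

# An auditor can now trace the reported number to its source system.
entry = lineage_log[-1]
print(entry["source"])  # erp.gl_entries
```

Hand-coded scripts rarely capture this metadata as a side effect, which is one concrete reason the tools make audits easier.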

Many still don't see the benefits

However, many companies still don't use data integration software, Sherman said. His experiences as a consultant indicate that while Fortune 500 companies often use data integration software for a large data warehouse, it's common for them to revert to hand coding for data marts, operational data stores or other projects.

    The top inhibitor is likely cost, Sherman said. While available ETL tools run the functionality and pricing gamut, a

robust integration platform can be a six-figure proposition for a large company, he explained. That's because the

    scale of the project affects the price. Data integration software is often either priced by concurrent CPU usage or by

    the number of source and target systems. Beyond licensing costs, training is often required for platforms, he said.

    Add this all up and many companies see in-house development as the path of least resistance.

"There's seemingly no cost to coding, just the person you have doing it," Sherman said. "But when that programmer leaves, who's going to maintain that person's code? If you worry about what you would do, you probably need an ETL tool."

    Justifying the cost of data integration software

    Certain systems may justify the use of data integration software more than others, Sherman added. While a

    one-time data aggregation project may not require software, any system updated or accessed frequently is a good

    candidate. For example, a business intelligence system, with data loaded on a regular basis, could benefit from the

    productivity gains of software. Systems accessed by many business users, or applications that will be expanded

    over time to encompass more data sources, may also benefit. Additionally, any sort of financial reporting system,

    especially those supporting SOX compliance, has good reasons to rely on data integration software.

The cost justification for data integration software is often more qualitative than quantitative, Sherman said. There's not a straightforward ROI equation. But the soft benefits, especially around auditability and transparency, are often compelling enough on their own to convince CFOs and CIOs to sign off on purchases, Sherman said.


    Resources from SAS

    Report: Competing on Analytics

Want to learn how specific organizations are using analytics? You can. Derived from summary reports of the hugely successful 2007 Competing on Analytics event tour, this compilation features analytics expert Thomas H. Davenport and highlights his groundbreaking findings. Pick up best practices and experiences from executives on how their organizations are building effective competitive strategies around data-driven insights and trumping their rivals.

    Online Assessment: Platform Pathfinder

Do you want to turn your information stampede into a logical path for success? You can. SAS invites you to visit our Platform Pathfinder, an innovative Web site that provides an opportunity to determine how your organization manages and uses information.

    Your journey includes:

    A five-minute Business Intelligence Assessment survey

    An objective look at the current roles of analytics, business intelligence and data integration in your

    organization.

    Learning resources tailored to your survey responses, including white papers, Webcasts and product

    demos.

    SAS Success Stories

Discover more about the organizations that use SAS. You can. SAS customers represent many of the most

    innovative and successful organizations in the world, including 96 of the top 100 companies on the 2007 FORTUNE

    Global 500 List.

    About SAS

    SAS is the leader in business intelligence and analytical software and services. Customers at 43,000 sites use SAS

    software to improve performance through insight from data, resulting in faster, more accurate business decisions;

    more profitable relationships with customers and suppliers; compliance with governmental regulations; research

    breakthroughs; and better products and processes. Only SAS offers leading data integration, storage, analytics and

    business intelligence applications within a comprehensive enterprise intelligence platform. Since 1976, SAS has

    been giving customers around the world The Power to Know.

    http://www.sas.com/
