Collaborative Articles in Data Analytics
LinkedIn provides community knowledge in data analytics based on insights and advice from people with real-life experiences.
Contributions include:
- Business Decisions
  - What are the best ways to ensure your data analytics projects align with company values?
  - What is the best way to prioritize data quality issues?
  - What are the best strategies for avoiding scope creep in a data analytics project?
  - How can you align data analytics with your business goals?
  - How can you evaluate the ROI of using emerging technologies in data analysis frameworks?
- Data Techniques
  - How does structured data modeling differ from unstructured data modeling?
  - What are the limitations of using predictive models to optimize product and service design?
  - How can data analysts be more adaptable in the face of new technology?
  - What are the benefits of performing data quality checks in your analytics process?
- Data Tools
  - How can you compare data analytics tools like Tableau, Power BI, and Google Analytics?
What are the benefits of performing data quality checks in your analytics process?
Data quality checks in your analytics process are essential to ensuring that a data asset reliably and promptly serves its intended purpose, whether that is business decision-making, customer experiences, or other points of integration.
In my experience, combining automated processes with manual reviews kept data quality high and delivered a great CX. Examples include (a minimal code sketch follows this list):
- Manually inspect data to uncover technical issues in in-app personalization before customers do
- Automatically de-duplicate product ads on a single page
- Automatically reclassify images until two separate reviewers agree on the data label
- Automatically alert the technology team when the platform fails to perform as expected
- Manually review ads with inflated metrics
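As a rough illustration, here is a minimal Python sketch of two of these checks: per-page de-duplication of product ads and flagging inflated metrics for manual review. The `Ad` fields and the click-rate ceiling are hypothetical values for the sketch, not the ones from our platform.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Ad:
    ad_id: str          # field names are illustrative assumptions
    product_id: str
    page_id: str
    click_rate: float

def dedupe_ads_per_page(ads: list[Ad]) -> list[Ad]:
    """Keep the first ad per (page, product) pair, mirroring the
    automatic de-duplication of product ads on a single page."""
    seen: set[tuple[str, str]] = set()
    unique: list[Ad] = []
    for ad in ads:
        key = (ad.page_id, ad.product_id)
        if key not in seen:
            seen.add(key)
            unique.append(ad)
    return unique

def flag_for_manual_review(ads: list[Ad], ceiling: float = 0.5) -> list[Ad]:
    """Route ads with implausibly high click rates to a human reviewer;
    the 0.5 ceiling is a made-up threshold for this sketch."""
    return [ad for ad in ads if ad.click_rate > ceiling]
```

The split mirrors the list above: cheap, deterministic rules run automatically, while judgment calls go to people.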
How can data analysts be more adaptable in the face of new technology?
Data analysts can be more adaptable in the face of new technology by:
- Solving problems for business and customers
- Collaborating with peers and architects
- Experimenting with new technologies
- Keeping up-to-date with industry trends
- Continuously learning new skills and techniques
- Embracing change and innovation
- Retiring obsolete technologies
At my market research firm, my product's data quality started out so poorly that we solicited cross-functional help to improve it. After we clearly defined our problem and goals, our Chief Research Officer introduced me to Amazon Web Services' crowdsourcing platform. After experimenting with this technology, we created AI models that delivered market insights with 95%+ accuracy and 80%+ data coverage.
How can you evaluate the ROI of using emerging technologies in data analysis frameworks?
You can evaluate the ROI of using emerging technologies in data analysis frameworks with holdout tests: measure outcomes for the group that uses the technology against a holdout group that does not, and attribute the difference to the technology.
Our enterprise platform used machine learning, an emerging technology, for predictive modeling to personalize in-app content for customers. As we scaled our platform, we needed to demonstrate our service's value to the business. We mapped our campaign metrics to business goals of revenue generation and cost savings. We then used holdout tests to compare the revenue lift and cost savings when using the technology vs. not using it. By demonstrating ROI, we gained support to scale our services across apps, placements, and campaigns.
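A minimal sketch of that arithmetic, assuming you already have per-user revenue figures for the treated and holdout groups; it leaves out significance testing, which a real evaluation would need.

```python
def average(values: list[float]) -> float:
    return sum(values) / len(values)

def revenue_lift_per_user(treated: list[float], holdout: list[float]) -> float:
    """Incremental revenue per user attributable to the technology."""
    return average(treated) - average(holdout)

def roi(lift_per_user: float, n_treated_users: int, technology_cost: float) -> float:
    """Net return per dollar of technology cost."""
    incremental_revenue = lift_per_user * n_treated_users
    return (incremental_revenue - technology_cost) / technology_cost

# Example: a $0.40 lift per user across 500,000 users against a $50,000
# cost yields an ROI of 3.0, i.e., $3 returned for every $1 spent.
print(roi(0.40, 500_000, 50_000.0))
```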
How can you align data analytics with your business goals?
You can align analytics with your business goals by making good decisions about how and why data will be used.
Effective analytics decisions require:
- Business strategy: business case, economic viability, planning, budgeting
- Customer centricity: customer problem definition, design thinking
- Continuous exploration: MVP + build, measure & learn
- Empirical milestones: incremental development with customer feedback
In my experience, proving upfront that analytics investments will deliver beneficial business outcomes is a challenge. A lean business case, experiments, prototypes, or pilots helped gain leadership approval & stakeholder alignment. Customer centricity helped secure technical support for a solution that evolved over time.
How can you compare data analytics tools like Tableau, Power BI, and Google Analytics?
Tableau, Power BI, and Google Analytics are tools that allow you to analyze, visualize, and share data.
At work, the tool I use depends on the source of the data I want to analyze:
- Tableau: a wide variety of sources, such as a relational database (data warehouse) or data lake (text files)
- Google Analytics: traffic data and user behavior for websites
- Power BI: data from Microsoft applications, such as Excel, Outlook, or SQL Server
Although Google Analytics focuses more on pre-built reports to understand website users, Tableau and Power BI make it easy to create reports or visualizations through drag-and-drop interfaces. Tableau and Power BI are paid services with a free trial. Google Analytics offers a free and paid version.
What are the limitations of using predictive models to optimize product and service design?
Although AI-powered platforms help companies optimize product/service design at scale, there are limitations to using predictive models to personalize the user experience (UX), including:
- Reliance on historical data that may not reflect future preferences
- Incomplete consideration of human behavior & decision-making
- Privacy concerns
- Lack of transparency
- Technical challenges for system implementations
- Inaccurate models due to bias or data errors
Organizations must effectively manage risk to ensure that predictive models augment and inform product/service design. At my financial services firm, I authored a white paper on enterprise model risk to ensure that our models brought "humanity, ingenuity & simplicity" to the banking UX.
What are the best strategies for avoiding scope creep in a data analytics project?
The best strategies for avoiding scope creep in a data analytics project are to consider the Pareto Principle (80/20 rule) and apply traditional project management frameworks.
The 80/20 rule states that roughly 80% of outputs result from 20% of inputs. Prioritizing work on the most impactful 20% of inputs before the long tail, which drives only 20% of outputs, helps prevent scope creep in data projects.
For example, in my advertising research service, my data science & operations team first focused on accurately classifying the 20% of ads that covered 80% of industry spend before expanding to the long tail.
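A minimal sketch of that prioritization, assuming a hypothetical `spend_by_ad` mapping; the 0.8 share is the classic Pareto cut, not a fixed rule.

```python
def pareto_head(spend_by_ad: dict[str, float], share: float = 0.8) -> list[str]:
    """Return the smallest spend-ranked set of ads covering `share` of
    total spend: the small head that drives most of the outcomes."""
    total = sum(spend_by_ad.values())
    head: list[str] = []
    covered = 0.0
    for ad, spend in sorted(spend_by_ad.items(), key=lambda kv: kv[1], reverse=True):
        if covered >= share * total:
            break
        head.append(ad)
        covered += spend
    return head
```

Classifying `pareto_head(spend_by_ad)` first, and deferring everything else, is the scope fence described above.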
Traditional practices include:
- Project planning
- Change management
- Stakeholder updates
- Value-based prioritization
- Continuous monitoring & learning
How does structured data modeling differ from unstructured data modeling?
Using structured vs. unstructured data for analysis in AI models depends on the intended use case and data format (a brief code sketch of the storage contrast follows the examples below).
Structured Data:
- Use cases: making predictions based on customer/business data
- Format: highly organized (predefined schema)
- Storage: relational databases
Unstructured Data:
- Use cases: insights into customer behavior, trends, or opportunities
- Format: no predefined schema
- Storage: data lakes for text, images, videos, social posts, etc.
For example, my teams applied:
- Unstructured data techniques to tag ad images with the advertised product in order to model market insights & trends
- Structured data modeling to predict the highest-value message based on customer data when personalizing banking app CX
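A minimal sketch of the storage contrast, with sqlite standing in for a relational warehouse and JSON blobs for the schema-less records a data lake would hold; the table, fields, and record contents are illustrative assumptions.

```python
import json
import sqlite3

# Structured: a predefined schema enforced by a relational store.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE customers (
        customer_id INTEGER PRIMARY KEY,
        segment     TEXT NOT NULL,
        balance     REAL NOT NULL
    )
""")
conn.execute("INSERT INTO customers VALUES (1, 'retail', 2500.0)")

# Unstructured: records of mixed shape with no schema, the kind of raw
# payload a data lake holds (paths, text, and tags here are made up).
raw_records = [
    json.dumps({"type": "ad_image", "path": "ads/0001.png", "tags": ["shoes"]}),
    json.dumps({"type": "social_post", "text": "Loving the new app!"}),
]
```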
What is the best way to prioritize data quality issues?
The best way to prioritize data quality issues is to operationalize requests through a priority ticket system.
When a customer, stakeholder, or team member identifies a data issue, they need to know how it will be handled and how quickly to expect a response.
Product and data teams need to triage and fulfill requests based on predefined workflows and service-level agreements to ensure consistent resolutions and response times. Ranking data issues can depend on a number of factors (a minimal scoring sketch follows this list), including:
- Customer impact
- Requestor
- Category
- Impact on the business
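A minimal triage sketch, assuming a 0-3 impact scale and illustrative weights; in practice, the predefined workflows and SLAs would set the actual ranking rules.

```python
from dataclasses import dataclass

@dataclass
class DataIssue:
    title: str
    customer_impact: int   # 0-3, hypothetical scale
    business_impact: int   # 0-3
    requestor_weight: int  # e.g., 2 for customers, 1 for internal teams
    category_weight: int   # e.g., 2 for compliance, 1 for cosmetic

def priority_score(issue: DataIssue) -> int:
    """Weighted score; higher scores are worked first. The weights are
    assumptions for this sketch, not a prescribed formula."""
    return (3 * issue.customer_impact
            + 2 * issue.business_impact
            + issue.requestor_weight
            + issue.category_weight)

def triage(queue: list[DataIssue]) -> list[DataIssue]:
    """Order the ticket queue so the highest-priority issues surface first."""
    return sorted(queue, key=priority_score, reverse=True)
```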
At my market research firm, we prioritized data issues based on customer impact, and we used artificial intelligence to automatically identify and correct the majority of issues.
What are the best ways to ensure your data analytics projects align with company values?
The best ways to ensure your data analytics projects align with company values include:
- Align practices & metrics with values
- Communicate insights & value to stakeholders
- Own CX & connect to business objectives
- Embrace transparency & feedback for continuous improvement
For example, we aligned our data platform with the following company values:
- Integrity & respect for customers: privacy/compliance reviews for customer data
- Teamwork: cross-functional collaboration with tech, ops, analytics & business
- Openness: provided API & dashboard access to the business
- Excellence: evaluated ROI & authored a white paper
- Innovation: first major platform to migrate to the cloud
- Inclusion: provided services for all business & customer segments