A marketing strategy is a document that answers the big questions—what you offer, who your audience is, what your company stands for, brand guidelines, what niche or industry you play in, and who your main competitors are. It can include less or more information than that, but its purpose is to be a sort of lighthouse, helping you keep your marketing focused on the bigger picture.
Marketing strategies are always rooted in your overarching business goals. If your business needs to increase revenue by 25%, your marketing strategy explains, at a high level, how marketing will help achieve that goal.
When writing a marketing strategy, do not—I repeat—DO NOT go into uber-specific marketing tactics. That’s for later. On the flip side, you should be asking the right kinds of questions.
What Is a Marketing Plan?
Your marketing plan, based on your marketing strategy, dives into the specifics—what channels and tactics you’re using, which segment of the audience you’re targeting, when initiatives are occurring, and how you’ll measure success.
Compared to your marketing strategy, your marketing plan is more about which tactics, campaigns, initiatives, and promotions you’ll run in a certain amount of time. Marketing plans can range from three months to a year, and should remain semi-flexible in case marketing needs to shift priorities.
MARKETING STRATEGY VS. MARKETING PLAN
To help visualize all of this, we’ve created a graphic that explains the core differences between a marketing strategy and a marketing plan.
In our experience, the marketing strategy shouldn’t change as much as your marketing plan. If it does, it’s almost impossible to gain any real traction or measure the impact of your marketing on the business. That said, you’ll probably need to revisit your marketing strategy each year, whereas you’ll be executing against your marketing plan regularly.
At this point, you might be thinking, “this seems like A LOT.” It is, which is why strategies and plans are often broken down into smaller disciplines within marketing (think: content, customer acquisition, customer retention, and more). Let’s look at an example.
An Example in Action: Content Marketing
Most marketers are familiar with content marketing, so let’s use it as an example. Like I mentioned earlier, strategies address high-level issues whereas plans show how you’re going to execute.
CONTENT MARKETING STRATEGY
In this case, your content marketing strategy would answer those big questions discussed above. Questions like:
Who should we be writing to?
How will content marketing support our business goals?
How much money can we spend on creating content?
Who are our content competitors?
What tone do we want to come across in our content?
The role of the content marketing strategy isn’t to point out every single topic you’re going to write about or even what your specific KPIs for the upcoming year are going to be. Instead, it’s supposed to give context to why you’re investing in content marketing, who you’re targeting, what content mediums you’re going to use (generally), and more.
CONTENT MARKETING PLAN
Once you have your strategy set, your content marketing plan should explain how you’re going to attack the problem. This is where you’ll document content marketing goals and KPIs, what resources you plan on creating, when you plan on creating them, who will be involved in the process, how you’ll be promoting each piece of content, and more.
There’s more to content marketing plans than an editorial calendar, but your editorial calendar is a huge part of your plan. It tells everyone what topics your content is addressing, where it’s being published, when it’s being published, which channels you’ll promote it through, and the goals it should be aiming to hit.
Tips for Creating a Stellar Marketing Strategy or Plan
Whether you’re focusing on the overarching strategy or getting into the fine details, there are a few things you need to keep in mind throughout the journey that will help produce something of real value for your organization or team.
TIP #1: DOCUMENT YOUR PLAN/STRATEGY
This sounds obvious, but it is oh-so important. Enter impactful stat—marketers who document strategy are 538% more likely to report success than those who don’t. That’s a real number. Write your plan down. Even if you don’t think it’s 100% correct.
As marketers, we want so desperately to get everything right that it often has the opposite effect—we’re paralyzed. You can go back and adjust your plan or strategy as necessary, but it starts with actually having a plan or strategy. Plus, without a documented vision for the future, it’s much harder to gain buy-in from your co-workers and miscommunication ensues.
TIP #2: COLLABORATE WITH OTHERS
You don’t have to do this alone. Nor should you. Even if you’re legitimately the only person in your marketing department, you can ask your marketing friends at different companies for some insight. If all your friends sadly work in sales (jk, jk) you can even gain some perspective from co-workers in different departments. They might not give you the best “marketing” ideas, but they might ask good questions and prove to be good sounding boards.
There are tons of positives when it comes to collaborating with others. A few I like are gaining new perspectives, considering other ways to solve the problem at hand, gaining feedback, and the efficiencies that come with asking people that may have done something similar in the past.
TIP #3: MAKE IT CLEAR. MAKE IT REALISTIC.
There’s no problem shooting for the moon. But if you create a marketing plan or strategy that requires 10 people to execute and you only have two sad interns, then it’s not all that helpful. You’ll just end up with a ton of ideas but no idea which ones to work through.
A wise person once said, “The essence of strategy is choosing what not to do.” Try your best to paint a clear picture that’s based in reality.
Don’t Let the Scary Marketing “Strategies” Overwhelm You
To play on even more cliches, “A journey of a thousand miles begins with a single step.” Try your best to avoid paralysis by analysis, and instead tackle your strategy or plan one step at a time.
And if you don’t think documenting your marketing plan or strategy is important, think again. Remember: marketers who document strategy are 538%(!) more likely to report success than those who don’t.
Finally, one last reminder for those who can't wait to dig in and start planning: we created a marketing toolkit that allows you to work through building your very own marketing strategy. Happy planning!
Courtesy of an article dated September 21, 2020 appearing in Element Three Blog
Over the last two decades of building and running businesses, and the last couple of years working full time with dozens of startup founders and CEOs on their strategies and funding plans in my consultancy business, I have observed that there are a common set of reasons that startups struggle and fail, and a consistent set of factors that make startup companies successful.
I wondered if my observations were supported by hard data, and my curiosity around startup success and failure eventually got the best of me. I decided to do some in-depth investigation around this topic. I wondered whether there were any research studies that showed why startups succeed and fail. I found several articles filled with unsubstantiated opinions, and a few sources with really great hard research on the topic.
Why do companies fail?
According to an article in FastCompany, "Why Most Venture Backed Companies Fail," 75 percent of venture-backed startups fail. This statistic is based on a Harvard Business School study by Shikhar Ghosh. In a study by Statistic Brain, Startup Business Failure Rate by Industry, the failure rate of all U.S. companies after five years was over 50 percent, and over 70 percent after 10 years.
This study also asked company leadership the reason for business failure, giving a list of four main reasons for failure with sub-categories below those. They also gave a list of 12 leading management mistakes. It is worth checking out the details. This research-based analysis confirmed some of my observations. I bracketed the Statistic Brain findings into seven key reasons entrepreneurs gave for business failure:
Lack of focus
Lack of motivation, commitment and passion
Too much pride, resulting in an unwillingness to see or listen
Taking advice from the wrong people
Lacking good mentorship
Lack of general and domain-specific business knowledge: finance, operations, and marketing
Raising too much money too soon
All of these focus on the decision-making of the entrepreneur and general business knowledge.
In another study, CB Insights looked at the post-mortems of 101 startups to compile a list of the Top 20 Reasons Startups Fail. The focus was on company level reasons for failure. I think this list is instructive, but each of these reasons for failure is due to a failure in leadership at some level. The top nine most significant from this study are:
No market need
Ran out of cash
Not the right team
Got outcompeted
Pricing/cost issue
Poor product
Need/lack business model
Poor marketing
Ignore customers
Notice that all of these are business- and team-related issues, even the ones that relate to the product. Issues like these are always tied to leadership and the leader's ability to build a strong team and drive a sound business model, business thought process and discipline. Also, keep in mind that if running out of money is the ultimate reason for failure, there are always other factors that caused that result.
Why do startups succeed?
Next, I looked for sources of information on why businesses were successful. I found some good research from Harvard Business School, Performance Persistence in Entrepreneurship, which suggests that serial entrepreneurs with prior success are more likely to succeed again, and that the best VCs are good at picking serial entrepreneurs. However, that didn't really answer my question about the qualities of the entrepreneur.
The most comprehensive research I could find that helped answer the "reasons for success" question was the Startup Genome report from The Ecommerce Genome by Compass, which looked at 650 internet startups. Although this research is specific to the tech industry, I still think it is very instructive. The report identified 14 indicators of success. Some of the 14 were a bit redundant, but you should review the report yourself. This analysis also confirmed some of my observations. I bracketed these 14 indicators into nine key factors for success:
Founders are driven by impact, resulting in passion and commitment
Commitment to stay the course and stick with a chosen path
Willingness to adjust, but not constantly adjusting
Patience and persistence due to the timing mismatch of expectations and reality
Willingness to observe, listen and learn
Develop the right mentoring relationships
Leadership with general and domain specific business knowledge
Implementing “Lean Startup” principles: Raising just enough money in a funding round to hit the next set of key milestones
Balance of technical and business knowledge, with necessary technical expertise in product development
Are the reasons for success the opposite of those for failure?
There are things that you must possess to be a successful entrepreneur, but they won't guarantee success. That said, it stands to reason that if you fixed the reasons for business failure, you would at least improve your chances of success. So, I decided to look at the side-by-side comparison of the reasons for failure and the factors for success.
If you look at both the reasons for failure and the factors for success, it is clear that commitment to a plan is key. This, of course, implies having a plan. It does not mean that you are completely inflexible, but it does mean you stay the course. This is why the most successful companies have only one or two pivots. I do not think of every little business adjustment or bit of fine-tuning as a pivot.
A true pivot is a change in course of direction that results in a material change in the product-market strategy. It could be along the product axis or the market axis, but it has to be enough of a change that it really requires an adjustment in strategy and a corresponding adjustment in resource allocation. At least, that’s my definition. Passion and motivation are the obvious factors. Every entrepreneur, business coach, consultant, advisor, newscaster, investor and industry analyst talks about passion. Steve Jobs is quoted all the time about this. It’s probably become too cliché and overused at this point.
What I like about this analysis is that it goes to the root of the passion. People that are successful believe in what they are doing. The successful entrepreneur feels that they can make an impact and a difference in the world. There is so much inertia and negativity around getting a startup off the ground, much less getting it to "escape velocity," that if you don't have this deep-seated commitment to making an impact, you will surely give up. Successful entrepreneurs are competitive. They play to win, and they hate to lose. This trait may show up differently with different personality types, but I have never met a successful entrepreneur that doesn't have a competitive spirit and a will to win.
The next two things go hand-in-hand. I kept them separate since I think mentorship is so important, and it has played such a huge role in my career success. Just because you are willing to learn does not mean that you are willing to seek a mentor and listen to their guidance. By the way, I'm not advocating that you take every piece of advice and guidance from your mentors, but if you have selected strong mentors with significant domain, technical or business expertise, you should at least thoughtfully consider what they have to say. Otherwise, why have them around as mentors? It comes down to humility. It's one of those things where, if you think you have it, you don't.
Successful startups are businesses. It therefore stands to reason that you need to establish and implement solid fundamental business principles and practices to improve your chances of success. Many technical founders fall in love with their product idea and consciously or unconsciously believe that if they build a better mousetrap, the world will beat a path to their door. However, both the success and failure studies show that you need leadership in the company with general and domain-specific business knowledge to be successful. Of course, you also need to have strong technical expertise in your chosen product development area.
Does this mean that a technical founder cannot be successful as a CEO? No, it doesn't. Look at Dr. Irwin Jacobs, the co-founder and founding CEO of Qualcomm, as a classic example. Dr. Jacobs is a brilliant engineer and former professor at MIT. However, he also has a brilliant business mind and a lot of business knowledge. Prior to Qualcomm, Dr. Jacobs ran another company, MA-Com, so he had experience running a company. He also surrounded himself with a strong management team. There are many other examples of this success formula, but there are far more where there is a seasoned businessperson who has domain expertise leading the company, and a strong technical team driving product development. Steve Jobs (Apple, NeXT, and Pixar) is the classic example as a business-oriented founder. Meg Whitman (eBay) and Eric Schmidt (Google) are great examples of CEOs who were brought into companies at an early stage to complement an exceptional team of technical founders.
Finally, having a clear and realistic idea of how long things take, setting intermediate milestones every 12 to 18 months, and raising just enough money to get to the next set of key milestones is not only important for capital efficiency, it is also important for success.
How do I become a member of the $100 million club?
Interestingly, according to the Kauffman Foundation, in its article The Constant: Companies that Matter, the pace at which the United States produces $100-million companies has been stable over the last 20 years despite changes in the economy. The study states, "Anywhere from 125 to 250 companies per year (out of roughly 552,000 new employer firms) are founded in the United States that reach $100 million in revenues." My former company, Entropic, achieved this status. How do you become part of that club? You need some luck and a good sense of timing. However, as the Roman philosopher Seneca said, "Luck is what happens when preparedness meets opportunity."
Beyond that, you need a plan, persistence, perseverance, a willingness to be flexible, and a world-class team. You also need to be frugal, bright, and cultivate strong mentors. The best way I know to do all these things well and efficiently is to follow a systematic process where you plan, commit, track results, promote accomplishments and raise the necessary capital, or "fuel in the tank," to drive the growth of your startup.
Plan. Commit. Win.
COMMENTARY: As a consultant it always pains me when a startup client launches successfully and gains traction, but never seems to quite "cross the chasm" that all startups encounter, and must cross in order to "get to the next level." Crossing the chasm simply means helping a product, service or technology move from "early adopters" to a larger market segment, sometimes called the "early majority," in the Product Adoption Curve (see below).
Product Adoption Curve
The product adoption curve is a standard model that reflects who buys your products and when.
Think of it as the big picture view of your product adoption. It takes the product lifecycle and considers what happens at different points.
In most product adoption models, there are five distinct stages. Each stage represents an arbitrary amount of time, so what’s most important here is the process as a whole.
Now let’s break this down step by step, stage by stage.
Stage 1. Innovators
The innovators are the first group of people to invest in your product.
This is a unique group. People who buy super early are usually obsessed with keeping up with the cutting edge of technology. When the first Apple iPhone launched on June 29, 2007, the innovators were the very first in line to buy it.
What’s most important about the innovators group is its size. You might have noticed that it’s small. That’s completely normal.
This is why you might only get a few sales immediately after you launch. You’ll typically get about 2.5% of your total sales from innovators.
The Innovators
Stage 2. Early Adopters
At some point, you’ll see a swell in sales, and you’ll start to get a steadier conversion rate.
This is probably because the early adopters have arrived.
Like innovators, early adopters tend to be ahead of everyone else, willing to test the waters.
Early Adopters
Although early adopters are similar to innovators, there are some important differences.
It could be the case that early adopters have purposely waited to buy your product.
Whereas innovators are fine with rushing in and testing out something new, early adopters are a bit more hesitant. They still want to try something new, but they want a few reviews to consult.
Then again, it could be the case that they just found out about your product.
Expect early adopters to account for about 13.5% of your total adoption.
Stage 3. Early Majority
Here’s when your product really gets some momentum going.
You’ve got a good amount of sales from innovators and early adopters. At this point, usually an even larger group sweeps in and gives you a heck of a lot more sales. Specifically, about 34%.
The people in the early majority are usually pragmatic and will only buy something once it’s been road-tested (at least a little bit) and has proven its value.
Early Majority
This is the beginning of your product’s peak. Maybe it’s gained traction with more marketing or word of mouth.
Stage 4. Late Majority
At stage 4, your product has been out for a while, and there’s widespread use.
However, there are still some people who are a bit skeptical of your product. Once they’ve put their worries to rest, they buy your product, and these people are usually in the late majority or laggards.
Late Majority
At some point during the early or late majority phase, you’ll have your peak where you get more sales than ever, and your product is at the height of its popularity.
Interestingly, in terms of adoption rates, the early and late majorities are usually roughly equal, around 34%.
Stage 5. Laggards
These are the people who buy your product after all the hype has died down. Sometimes, laggards purchase a product years after it’s been released.
Laggards might be extreme skeptics or people who have only heard about your product a long time after you launched it. Whatever the reason, these people don’t buy until much later in the product lifecycle.
The Laggards
Surprisingly, this is a pretty big group. 16% of your product adoption will come from laggards.
Try to wrap your head around the fact that laggards have a higher adoption rate than early adopters.
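For readers wondering where these exact percentages come from: in Rogers' diffusion-of-innovations model, adoption time is treated as a normal (bell-curve) distribution, and each adopter group is simply a band of that curve measured in standard deviations from the mean adoption time. A minimal Python sketch using SciPy reproduces the familiar split:

```python
# A minimal sketch of why the classic adoption percentages look the way they do.
# Adoption time is assumed to be normally distributed; each adopter category is
# a band of the bell curve, measured in standard deviations from the mean.
from scipy.stats import norm

bands = {
    "Innovators":     (-float("inf"), -2),  # more than 2 sd earlier than average
    "Early adopters": (-2, -1),
    "Early majority": (-1, 0),
    "Late majority":  (0, 1),
    "Laggards":       (1, float("inf")),
}

for name, (lo, hi) in bands.items():
    share = norm.cdf(hi) - norm.cdf(lo)
    print(f"{name:<15} {share:6.1%}")

# Output (rounded): 2.3%, 13.6%, 34.1%, 34.1%, 15.9% --
# the familiar 2.5 / 13.5 / 34 / 34 / 16 split quoted above.
```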
Change Your Marketing as Your Product Ages
At each stage of the product adoption curve, it’s likely there’s going to be certain demographics buying your product.
For example, innovators are more likely to buy on impulse, while buyers in the late majority will do lots of research before purchasing.
And as your product gets older, it will become more well-known. So you might start out with a product no one knows but end up with a product everyone and their brother has heard of.
Given these facts, consider changing your marketing messages as your product ages.
The Apple iPhone Marketing Messages Over Time
The marketing of each successive version of the Apple iPhone illustrates how Apple changed its marketing message to appeal to innovators, early adopters, early majority, late majority and laggards.
A commercial for iPhone 2 showed off a lot of the hip new features: music, email, and Internet browsing, to name a few.
iPhone 2 TV Commercial
This obviously appealed to a younger, more tech-savvy audience.
Then in 2010, three years after the first iPhone launched in 2007, the iPhone 4 came out with a commercial that featured two grandparents celebrating their granddaughter’s graduation:
iPhone 4 TV Commercial
Apple wanted to show that even grandparents (who may not have understood smartphones back in 2010) could benefit from the iPhone. This is important because older consumers are typically late adopters.
Apple’s strategy was clear: Begin by showcasing all the bells and whistles, then open up the audience to include more types of customers.
In the same way, you should think about what your marketing should look like at each stage of the product adoption curve.
For example, when the innovators and early adopters come rolling in, your marketing should clearly describe the value and benefits of your product.
Later on, perhaps in the late majority stage, you can utilize customer testimonials and reviews. This can help address the skepticism that later adopters typically have.
Think about addressing the common questions that each group has. Innovators will ask themselves what's so unique about your product, while the early majority wants to know what other people think about your product and why it's useful.
Thinking like this can completely change your marketing. By sending a customized message every step of the way, you’ll battle objections and questions head-on.
Know How to Overcome The Chasm
In most product adoption curves, there’s a point that can make or break the success of the product.
It's called the chasm. It's the point between the early adopter stage and the early majority stage.
The Chasm
As the chart above represents, crossing the chasm means breaking into the mainstream market. It’s one of the most difficult aspects of product adoption, but it’s one of the most important aspects to get right. There’s even a bestselling book on the topic––Crossing the Chasm.
Crossing the chasm is particularly tough to do for a few reasons. One reason is that as your product ages and grows, your audience will have higher expectations. Specifically, your potential customers will want increasingly better reasons to buy your product. You have to be ready to meet these demands throughout your product’s lifecycle, but it’s especially important in getting past the chasm.
As impulse buyers, the innovators and early adopters didn’t need huge reasons to buy your product. But to get the early majority to convert, that’s exactly what you’ll need. You have to think about your branding and not just your product. You have to offer value and not just features.
Another reason for the difficulty is the possible necessity of pivoting. In other words, to cross the chasm you may need to take a new angle for your campaign. Early on, you may be hedging on the idea behind your product. Early adopters are cool with that, but the early majority wants consistency.
If you’re at the chasm right now, you might need to pivot yourself or even improve your product.
Don’t Forget The Laggards
You can’t stop after your product has hit its pinnacle and is riding the waves of success. It's important to remember, the second largest adoption group is laggards, coming in at 16%. A lot of people will be buying your product well after the hype dies down, and you can’t forget or alienate this audience.
Laggards are often skeptics, so at the end of your product lifecycle, your marketing should be laser-focused on overcoming objections. Think about it––you’re marketing to people who resist change and may not even want to be a customer. They’re going to need awesome reasons to invest in your brand. (A slew of positive testimonials, reviews, and press mentions will come in handy for this.)
Time also plays an important role. Think back to the iPhone example; sure, older folks are commonly seen with iPhones, but it's been a decade since the device's initial release. It might take a lot of time and exposure to your brand before laggards adopt it.
Finally, you'll also need to brace for the declining sales that inevitably occur at the end of the product life cycle. If your brand is experiencing one or more of the symptoms listed in the Product Life Cycle chart below, it's time to evaluate whether you can extend its life by introducing an improved version, replace the product with an entirely new one, or drop the brand or line entirely.
Product Life Cycle
Courtesy of an article appearing in the September 2014 issue of Entrepreneur and an article dated October 23, 2017 appearing in The Daily Egg
Digitally savvy consumers know there’s an abundance of choices when it comes to purchases. With high expectations, most will seek out appealing items with little regard to brand loyalty. Churn and attrition are at an all-time high.
The response for organizations sounds simple enough: Provide a consistently good, engaging customer experience, optimize it on a variety of devices and deliver it when customers want it. Why has it been so hard for organizations to do this?
The answer starts with the way companies operate on the back end. With multiple organizational silos, no online/offline data synthesis, rigid customer databases and other inflexible legacy systems, organizations only have a piecemeal view of the customer. It’s hard to take advantage of all the existing corporate customer data that’s available, much less the rich variety of external data. As a result, marketing efforts are fragmented. Communications are inconsistent and ineffective. And revenue growth is hindered.
By taking a technological approach that synchronizes marketing processes with the customer journey across multiple channels, organizations can achieve great results – in terms of revenue, customer advocacy and loyalty. First, they need to get a panoramic view of each customer. Then they can understand and anticipate customer behavior; orchestrate the next best action across any channel; and accurately measure results to inform future actions.
SAS recommends that you connect your marketing efforts with all the relevant data from customer interactions as well as back-end operations. Then, through advanced customer and marketing analytics, you can deliver an integrated, omnichannel experience and truly compelling content. By responding to your customers on their terms – right content, right time, right device – you can keep them coming back for more and raise their value to your business.
Step 1: Synchronize Marketing Processes Based on a Comprehensive Understanding of the Customer
When marketing departments, call centers, service operations and merchandisers operate independently based on their own distinct views of the customer, both customer engagement and marketing efforts suffer.
Consider a scenario where a customer's browsing history (showing his preferences or inferred interests) is in one database while offline point-of-sale data about the customer is in another database. If these databases are not connected, there's a good chance your interactions with that customer will be less relevant than he expects. Or you may see the "echo effect," where you reach the customer through one channel but he responds through a different one – leaving you unsure how to attribute the response or plan your next offer.
Many organizations don’t use their existing corporate data to the fullest extent. They overlook opportunities to enrich customer data with information from service records, operations or contact centers. Many also fail to use external data sufficiently, missing chances to broaden their understanding of the customer with data from social media, open data, third-party data, etc. In an ad sales scenario, these proprietary data sets can present a unique differentiator in the marketplace and enable you to create highly targeted campaigns for your advertisers.
SAS Customer Intelligence solutions provide a panoramic view of the customer by consolidating all first-, second- and third-party data. From digital data to CRM information and call center records, SAS captures, integrates and transforms disparate data sources, breaking down multiple customer data silos. Built-in data management capabilities ensure that you can use your data effectively to engage customers, and boost ad sales. Use SAS to:
Improve data quality where the data resides, regardless of whether it’s in a marketing or operational system. SAS profiles, standardizes, monitors and verifies data without moving it, which creates significantly faster, more secure processes. So you can speed up many marketing processes to run in real time and near-real time instead of weeks and months.
Access the data you need, no matter where it’s stored – from legacy systems to Hadoop. You can create data management rules once and reuse them, for a standard, repeatable method of improving and integrating data – without additional costs.
Be confident that your data is reliable and ready to use for analytics, whether you’re doing segmentation, content recommendations, next best offer, retention or lifetime value scores.
Create a panoramic view of the subscriber that connects all touch points, contact history and online/offline interactions.
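To make the "panoramic view" idea concrete, here is a minimal, hypothetical sketch of consolidating three customer-data silos into a single table. It is not SAS code; the file names, column names, and the use of email as the matching key are illustrative assumptions only.

```python
# Hypothetical sketch of building a single customer view from three silos.
# File names, column names and the matching key (email) are assumptions,
# not a description of any specific SAS workflow.
import pandas as pd

crm = pd.read_csv("crm_export.csv")          # customer_id, email, segment, lifetime_value
web = pd.read_csv("web_sessions.csv")        # email, last_visit, pages_viewed
calls = pd.read_csv("call_center_log.csv")   # email, open_tickets, last_contact

# Light standardization so the join key matches across systems.
for df in (crm, web, calls):
    df["email"] = df["email"].str.strip().str.lower()

# Aggregate interaction data to one row per customer before joining.
web_agg = web.groupby("email", as_index=False).agg(
    last_visit=("last_visit", "max"), pages_viewed=("pages_viewed", "sum"))
call_agg = calls.groupby("email", as_index=False).agg(
    open_tickets=("open_tickets", "sum"), last_contact=("last_contact", "max"))

# Left-join onto the CRM master record to form the panoramic view.
customer_360 = (crm.drop_duplicates("email")
                   .merge(web_agg, on="email", how="left")
                   .merge(call_agg, on="email", how="left"))

print(customer_360.head())
```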
Step 2: Understand Customer Behavior and Fuel Content Engagement
Content is core to enticing and keeping consumers. You can attract the right customers by optimizing your content. But it’s just as important to optimize the customer’s overall experience. Using advanced techniques like text and predictive analytics, you can improve search engine optimization (SEO) for digital content, quickly categorizing content and text mining words, phrases and topics for customers.
Beyond SEO, you can profile and segment customers based on their historical behavior, profitability and lifetime value. Through a range of predictive analytic models, including affinity analysis, response modeling and churn analysis, you’ll know whether it’s a good move to combine digital and print subscriptions. You’ll recognize which content merits a fee versus which content you can monetize without a paywall.
To keep your marketing efforts fresh, you’ll need to continually supply models with updated data as you interact with customers and prospects. For example, your models should include purchase transaction data, online data from website users, direct marketing response data and more.
Figure 1 - Decision tree to quickly identify the variables that can best predict iPad usage and high versus low user populations
Through advanced analytics, you can use these models to predict behavior and:
Identify how different customer segments are most likely to respond to specific content, campaigns or marketing actions. Your approach will be based on analytically driven, granular segmentation of both known and unknown customers.
Reach the target population that’s most likely to respond positively to certain content, campaigns and other marketing activities. With predictive modeling, you can understand and predict the behavior of each targeted group.
Improve economic outcomes using optimization to make the most of each individual customer communication. Take into account resource and budget constraints, contact policies, the likelihood of customers responding, and more.
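As an illustration of the response and churn modeling described above, here is a minimal, hypothetical propensity-scoring sketch in Python with scikit-learn. The training file, feature names, and the choice of logistic regression are assumptions for the example, not a description of SAS's implementation.

```python
# Hypothetical response-propensity sketch: score how likely each customer is
# to respond to a campaign, based on past behavior. Column names are invented.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

data = pd.read_csv("campaign_history.csv")   # one row per customer
features = ["recency_days", "frequency_90d", "monetary_90d", "email_opens_90d"]
X, y = data[features], data["responded"]     # responded: 1 if they converted

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42, stratify=y)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("Holdout AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))

# Score the full base; the top decile becomes the target segment.
data["propensity"] = model.predict_proba(X)[:, 1]
target = data.sort_values("propensity", ascending=False).head(len(data) // 10)
```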
Step 3: Automate and Synchronize Customer Engagement Across Channels
Once you’ve determined which analytics approach is best, you’ll need to automate your engagement activities with customers. SAS Marketing Automation helps you to quickly define target segments, prioritize selection rules, choose appropriate communication channels, schedule and execute campaigns, analyze results, and make adjustments to improve future campaign performance.
Use SAS to orchestrate data-driven marketing activities across all of your channels. So you’ll be able to present customers with the best, most profitable offers to keep them engaged or to win them back from competitors. Analyze – in real time – how people get to your site and what they do while there. Then present them with engaging content at precisely the right moment. Use SAS to:
Build an omnichannel marketing environment so you can align outbound and inbound marketing tactics across all channels.
Develop event-triggered campaign tactics to ensure timely, relevant marketing strategies.
Know the next best action to take for each customer by incorporating analytics into your marketing execution efforts.
Track the effectiveness of all marketing activities and monitor campaign results in real time.
Reduce your reliance on IT for campaign creation and deployment with an easy-to-use interface.
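To show what an event-triggered, next-best-action rule might look like in plain code, here is a small, hypothetical sketch. The events, thresholds, and offer names are invented; a marketing-automation platform such as the one described above would express equivalent logic in its own campaign designer.

```python
# Hypothetical event-triggered "next best action" sketch. The event names,
# thresholds and offers are illustrative only.
from dataclasses import dataclass

@dataclass
class Customer:
    churn_risk: float        # e.g., from a propensity model like the one above
    cart_abandoned: bool
    days_since_last_visit: int

def next_best_action(c: Customer) -> str:
    if c.cart_abandoned:
        return "send_cart_reminder_email"          # time-sensitive trigger first
    if c.churn_risk > 0.7:
        return "offer_retention_discount"
    if c.days_since_last_visit > 30:
        return "send_reactivation_campaign"
    return "no_contact"                            # respect contact policy

print(next_best_action(Customer(churn_risk=0.82, cart_abandoned=False,
                                days_since_last_visit=12)))
# -> offer_retention_discount
```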
With a complete view of the customer, a deep understanding of behavior and automated engagement efforts, you'll be able to make decisions that resonate with customers and invigorate your marketing efforts. For example, if you know a customer checks email every Friday, you'll send her an email on Friday – because you'll know that's the best way to reach her. You'll also be able to distinguish between premium content and content that should be free. You'll know what will hook your customers, whether they're using your services for the second time or the hundredth time.
Today’s customers demand value and expect a consistent experience regardless of the channel or device they’re using. SAS positions you to meet these ultra-high customer expectations at every touch point.
Figure 2 - A marketing campaign response measurement dashboard
Step 4: Effectively Measure Campaign Performance and Attribution
It's hard to overstate the importance of accurate, useful measurement. By combining SAS Reporting capabilities with SAS Visual Analytics – a visualization and exploration suite built to handle big data – you can easily examine the effectiveness of your marketing campaigns and tactics based on your budget and success metrics. Use response attribution modeling to understand the customer's conversion path, and to know where to assign marketing credit. Then you can create future marketing mix optimization models, test/control strategies, predictive models and marketing campaigns.
With adaptive, agile marketing, you can test your offers and content quickly, on a small scale, and nurture continually richer customer interactions. Then get rapid feedback to show you when and how to modify the customer’s experience to get the most impact. Plus, you’ll have easy access to campaign reports and dashboards so you can track and manage campaigns across all of your channels.
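As a toy illustration of response attribution, the sketch below contrasts two common rules, last-touch and linear credit, for a single converting journey. The channel names and revenue figure are invented for the example.

```python
# Hypothetical sketch contrasting two simple attribution rules for one
# converting customer journey. Channel names and revenue are invented.
from collections import defaultdict

journey = ["display_ad", "organic_search", "email", "paid_search"]  # touchpoints in order
revenue = 120.0                                                     # value of the conversion

def last_touch(path, value):
    return {path[-1]: value}                      # all credit to the final touch

def linear(path, value):
    credit = defaultdict(float)
    for channel in path:
        credit[channel] += value / len(path)      # equal credit to every touch
    return dict(credit)

print("Last touch:", last_touch(journey, revenue))
print("Linear:    ", linear(journey, revenue))
```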
Figure 3 - Campaign and offer performance reports are integrated with revenue metrics and demographic indicators
COMMENTARY:
A New Definition of Data-Driven Marketing
What is data-driven marketing, how can event marketers effectively use it to drive conversions, and why does it matter? For decades marketers were forced to launch campaigns while blindly relying on gut instinct and hoping for the best. That all changed with the digitization of business and an increasingly demanding and digitally connected consumer. Now more than ever, there is a greater urgency to develop data-driven marketing campaigns as organizations have come under increasing pressure to deliver results or ROI for their marketing spend. To be successful in this landscape, a modern marketing campaign must integrate a range of intelligent approaches to identify customers, segment, measure results, analyze data and build upon feedback in real time.
While almost every area in marketing has been folded into the digital marketing ecosystem, in-person events have remained elusive to today’s modern marketer. In fact, when it comes to tracking your marketing efforts and determining which channels provide the best return on investment (ROI), most marketers will agree that results from in-person events are still difficult to track:
69% of marketers say that tracking ROI for events is their primary challenge. (Aberdeen Group)
Only 48% of marketers report having any event ROI metric in place (Regalix)
82% of marketers cannot quantify the data received from attendee interactions at their corporate events (Kissmetrics)
Indeed, events often lag behind other marketing methods by a significant gap, with the success or failure of many events based solely on anecdotal evidence instead of quantitative measurement and logic.
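For a sense of what quantitative measurement looks like at its simplest, event ROI is just attributed revenue net of event cost, divided by event cost. The figures in this short sketch are invented for illustration.

```python
# A minimal sketch of quantitative event-ROI measurement, assuming you can
# attribute pipeline revenue to attendees (the numbers are invented).
event_cost = 45_000.0            # venue, travel, booth, staff time
attributed_revenue = 130_000.0   # closed-won deals traced to event attendees

roi = (attributed_revenue - event_cost) / event_cost
print(f"Event ROI: {roi:.0%}")   # -> Event ROI: 189%
```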
Furthermore, because data-driven marketing produces highly personalized, engagement-focused campaigns for everything from enterprise servers to event apps, consumers are now beginning to expect a high level of personalization with each transaction.
What is data-driven marketing?
Let’s start out by trying to develop a simple definition for a relatively complex concept and practice. Data-driven marketing captures insights and data from a prospect, analyzes and scores the prospect’s data and behavior, and then subsequently triggers marketing actions and campaigns based upon marketing analysis. An appropriate analogy is to think of data-driven marketing from the consumer side in the average online shopping experience. When you purchase an item online, data-driven marketing strategies provide recommendations of complementary products to provide a better overall experience. If you’re looking at airfare rates for your next vacation to Hawaii, a data-driven marketing approach will focus on restaurants around the island with cuisine you regularly Google, potential places to stay based on positive reviews on Facebook, visitor’s guides that reflect your online budget-hunting practices and local activities such as scuba diving, listed on your LinkedIn profile.
By comparison, when you look at data-driven marketing from the marketer's side, you'll find a much more complex process. As you are able to obtain and update information on the customer from secondary sources, such as social media sites and web search data, you can create an approach that is customized to their buying behavior, interests, past purchases, web searches, social media posts and similar information. In other words, this approach allows you to optimize your funnel and customize your buyer journey to that particular prospect's needs. You can also survey prospects to obtain primary sources of data, but be aware that there is often a bias between what individuals or groups claim versus their actual behavior. For example, an event attendee who was ranting about poor service at the luncheon one day may be raving about the closing keynote, leaving you with plenty of praise on the keynote but failing to mention the luncheon on the exit survey. Once you've obtained the data you need to build a comprehensive profile, you can divide your prospects into the personas they fit best. This allows you to customize and personalize your approach, timing, channel and subject matter to optimize the results for each persona group.
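To make the persona-grouping step concrete, here is a minimal, hypothetical clustering sketch in Python. The behavioral features, the input file, and the choice of four personas are assumptions for illustration, not a prescribed methodology.

```python
# Hypothetical sketch of grouping prospects into personas by behavior.
# Feature names, file name and the number of personas (k=4) are assumptions.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

prospects = pd.read_csv("prospects.csv")     # one row per prospect
behavior = prospects[["web_visits_30d", "content_downloads",
                      "event_sessions_attended", "email_click_rate"]]

scaled = StandardScaler().fit_transform(behavior)
prospects["persona"] = KMeans(n_clusters=4, n_init=10,
                              random_state=0).fit_predict(scaled)

# Inspect each persona's average behavior to name it and tailor messaging.
print(prospects.groupby("persona")[behavior.columns].mean())
```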
The problem many marketers run into at in-person events is that they often don't have the information they need to determine how to best engage each prospect. The closest option currently available is the badge scan, which provides contact information and basic registration details. But scans don't provide the data you need to track that prospect's engagement before, during, and after the event to prove the event ROI that particular group of prospects has generated for your company. As an example, at a recent conference, my badge was scanned by a gentleman from a company that prints promotional items. I was looking through the items in his booth to determine if there was anything I could use for our company's next event. Though the exhibitor could have collected further data from me at the time, it would have been at the cost of other prospects that he could not help while gathering my information. When I returned home from the event, I had several recommendations for items that didn't meet our needs because the minimum quantity was much too high, the quality wasn't good enough and the prices were too high. The company had my contact information, but didn't know enough about me or my organization to make appropriate recommendations. A data-driven marketing approach to this in-person event would have drastically improved my experience while increasing the marketer's Event ROI.
How does data-driven marketing improve your ROI?
If you're still wondering how data-driven marketing can make a difference to your company, you're not alone. Though there was a 14% increase in confidence in putting big data to work in marketing departments from 2013 to 2014, with expectations for additional growth, many marketers still don't know how the additional data provides a solid improvement in ROI or how to use the data to their company's best advantage. In fact, companies that have added data-driven marketing to their marketing toolbox and recorded the results have often seen a 10-20% improvement in their ROI. Like any tool, it must be used correctly and implemented alongside the other tools in your kit, such as social media data, search analytics, SEO, content targeting and better buyer personas.
Why does data-driven marketing make such a big difference? Using the marketing convention example above, if the company had used data-driven marketing techniques to track my information, they would have known my organization was operating on a modest budget. That alone made their special offer on a tri-fold brochure with a minimum order of 5,000 a very bad fit. Instead of learning more about the client, the company made a suggestion based on what was popular with their clients in general, few of whom had the needs of our organization, and lost a prospective sale. A targeted campaign based on data-driven marketing would have recommended an inexpensive, small-minimum product order, while offering additional items that would have fit well with our company's mission.
Courtesy of an article titled "How a Data-Driven Approach Can Engage Customers and Boost Marketing Returns" appearing in SAS blog and an article dated May 26, 2016 appearing in Certain
A new study by the Economist Intelligence Unit has just been released that shows how big data is moving from its infancy to “data adolescence,” in which companies are increasingly meeting the challenges of a data-driven world.
The report, called “Big Data Evolution,” details the ways in which companies’ attitudes and activities have changed over the past four years with regards to big data — collecting it, storing it, analyzing it, and using it to make business decisions about strategy.
Data is becoming a corporate asset
The report shows that, since 2011, substantially more companies are treating their data as what it actually is: a strategic corporate asset. The initial excitement about the possibilities presented by big data is morphing into a more strategic approach, defining which data initiatives will have the biggest and most immediate impact.
I refer to this as asking the right questions. Companies are getting a bit savvier, and on the whole, are not asking for more data, but rather the right data to help solve specific problems and address certain issues.
Because of this greater understanding, executives are more likely to report they are making good, fact-based decisions about their data and their business.
In addition, data strategy has been elevated to the C-level, usually centralized with a CIO/CTO or a newly-appointed Chief Data Officer (CDO). Outside that position, executives across the board are more likely to be in charge of their departments’ particular data initiatives and instrumental in putting those resources to use.
Another important finding of the survey points to a strong correlation between good data management practices and financial success.
Companies with a well-defined data policy are much more likely to report that they are financially competitive with their peers and rivals. In addition, they're more likely to report that their data initiatives are successful and effective at resolving real business problems.
The reason for this could be that data initiatives are moving out of the realm of theoretical possibility and well into reality, demonstrating the ability to focus on real business problems and provide practical solutions.
Less about volume and velocity than value
One final encouraging trend the report finds is that the “bigness” of big data is starting to wear off. Companies are less focused on the quantity of data they can collect and the speed at which they can access it, and more focused on the value the data can provide for their business.
I strongly believe that the right data is much more important and valuable than simply collecting more data, and the report bears that out.
As technology continues to improve, the “bigness” of big data will become less and less of a factor. Companies are becoming more comfortable with the idea that they will need to scale up to allow the value of data initiatives to reach all sectors of the business, and so they are becoming more comfortable with approximation, agility and experimentation.
From my point of view, all of these are positive signs that big data is moving out of its infant stages and is well on the way to data maturity.
Bernard Marr is a best-selling author, keynote speaker and business consultant in big data, analytics and enterprise performance. His new books are 'Big Data' and 'Key Business Analytics'.
COMMENTARY: The tone of corporate conversations about big data continues to shift from initial excitement to expecting long-term business impact.
Over the past four years, executives have not only become better educated about the technology behind big data, but have fully embraced the relevance of data to their corporate strategy and competitive success. It could be said that most companies are experiencing their “data adolescence”, increasingly rising to the challenge of executing and delivering against the promise and potential of big data.
What are the hallmarks of this current stage of evolution, and what does the path to “data adulthood” look like from here?
In February 2015, the Economist Intelligence Unit (EIU) conducted a global survey of 550 senior executives sponsored by SAS, to follow up on our 2011 and 2012 executive surveys. By comparing the results, we were able to examine the evolution of companies’ views, capabilities and practices regarding big data as a corporate asset, and explore the future implications as companies continue to mature as strategic data managers.
Additionally, EIU conducted six in-depth interviews with leading corporate big data thought leaders and practitioners. Two of these interviews revisited specific big data–related issues these companies faced beginning in 2011.
Key highlights of the research include the following:
Since 2011, a significantly larger proportion of companies have come to regard and manage data as a strategic corporate asset. The ranks of companies with well-defined data-management strategies that focus on identifying and analysing the most valuable data (referred to here as "strategic data managers") have swollen impressively since 2011. No longer indiscriminate data collectors or wasters, companies are entering a period when the initial excitement over the possibilities presented by big data gives way to the need to prioritise the data initiatives with the biggest payoff. More companies have ventured further into this stage of their data evolution, and their executives are more likely to feel that they are better at making good, fact-based business use of their information.
Strategic data management is correlated with strong financial performance. Our survey points to a clear correlation between managing data strategically and achieving financial success. Companies with a well-defined data strategy are much more likely to report that they financially outperform their competitors. In addition, they are more likely to be successful in executing their data initiatives and effectively applying their data and analytics to resolve real and relevant business problems.
Data-strategy ownership has been elevated and centralised, while engagement and demand from the business is at an all-time high. Across industries, data strategy has been elevated and centralised to the C-level, most often with the CIO/ CTO or the newly minted chief data officer (CDO) role. At the same time, senior executives across functions and business units are increasingly in the driver’s seat of their data initiatives, and not just relying on IT leadership to design and execute them.
Data initiatives have moved from theoretical possibilities to focus on solving real and pressing business problems. Companies approach data initiatives today with a clear focus on their purpose—putting business value first. They are much more likely to start by articulating and finding a consensus on the high-priority business problems the organisation will solve by leveraging its data assets. Financial resources available for big data initiatives remain scarce, so there is a pronounced need to prioritise which initiatives to invest in, as well as how to demonstrate the financial return on these investments.
Technical challenges associated with quality, quantity and security persist. Even top performers continue to struggle with a number of technical aspects of big data. These foundational aspects of data management still drown out the more advanced, higher-value-add aspects of data management, such as governance, compliance and converting data into actionable insights.
The future of big data is less about volume and velocity, and more about the value that the business can extract from it. Going forward, companies will have to shift their attention away from the "bigness" of big data and focus on its business value. Data and analytics will be increasingly applied to predict future outcomes and automate decisions and actions. Most importantly, many companies will have to continue to evolve their structure and culture to scale up successful data pilots across the entire organisation. This means becoming more comfortable with approximation, agility and experimentation, and reinventing themselves into a new kind of information-driven, data-centric business—closer to data adulthood.
CIO's Now Consider Big Data Analytics A Game-Changer
When The Economist Intelligence Unit interviewed him back in 2011, Greg Taffet, CIO of U.S. Gas & Electric, said:
“It is going to be a game changer.”
He was referring to fast-moving, real-time “big data”—which, at that time, was a novel buzz word.
In EIU's first comprehensive study in 2011 of how companies perceive and handle big data as a corporate asset, just 9% of survey respondents said data had completely changed the way they do business, while 39% believed data had become an important tool that drives strategic decisions at their organisation. But more than half of executives saw data in less critically important terms.
EIU survey: how executives view the role of data in their business, 2011 vs. 2015
As you can see from the above EIU survey, 58% of respondents (the first two columns for the 2015 survey: 14% + 44%) now see data as a game-changing asset or, at least, an important decision-making tool.
In the EIU 2011 study, four categories of companies were identified based on the sophistication of their thinking and strategy vis-à-vis corporate data:
Strategic data managers: companies that have well-defined data-management strategies that focus resources on collecting and analysing the most valuable data;
Aspiring data managers: companies that understand the value of data and are marshalling resources to take better advantage of them;
Data collectors: companies that collect a large amount of data but do not consistently maximise their value; and
Data wasters: companies that collect data, yet severely underuse them.
EIU plotted the four corporate categories and compared 2011 vs. 2015 in the following chart:
Figure 2 - The four corporate data-management categories, 2011 vs. 2015
The above chart (Figure 2) shows that, in the last four years, companies have advanced along the evolutionary curve and, compared with 2011, many more now have a well-defined data strategy. The ranks of strategic data managers have swollen impressively (from 18% to 33%), the only category to show growth, while the share of data collectors and wasters is shrinking (from 28% to 20%).
Further evidence that companies are moving beyond strategy development and are tackling the adoption, or implementation, stage of data evolution is the fact that executives today put more of their valuable data to good use (see Figure 3).
Figure 3 - Proportion of valuable data put to good use
Alan Feeley, managing director of global shared services at Siemens, a global engineering firm, points out:
“Data and analytics are no longer opportunistic. They are now formal research areas for our company.”
The CIO and CTO Have Now Taken Ownership of Big Data
The ownership of data strategy and the sponsorship of data initiatives have evolved throughout the organisation. Responsibility for the organisation’s data strategy has been elevated and centralised to the C-level, but at the same time, the pull and energy are increasingly coming from the lower levels of the corporate pyramid. Over half of companies surveyed make sure that data are available to employees who need them, and offer the appropriate technology and training programmes. Data strategy has become “everybody’s business”—senior executives across functions and business units are increasingly in the driver’s seat of their data initiatives, instead of relying on the CIO or CTO to design and execute them in a top-down manner.
The vertical migration to centralised leadership of data strategy and strong ownership from the C-suite is an emerging best practice today. Ram Chandrashekar, executive vice-president of operational excellence and IT and president of Asia Pacific and Middle East region at ManpowerGroup, a global human-resources consulting company, says:
“Clearly, a top-down data strategy driven and articulated by the CEO is a critical success factor.”
The EIU survey data support his observation.
Over the past four years, ownership of corporate data strategy has migrated upwards from executives at the business-unit level to C-suite members—particularly, the CIO. In 2011, 23% of respondents said their CIO is primarily responsible for all data initiatives. This proportion jumped to 30% in 2012, and continued to rise to 39% in 2015 (Figure 7 below).
Click Image To Enlarge
A new arrival in the 2015 survey is the increasingly popular chief data officer (CDO) role. This C-level position was virtually unknown in 2011—limited mostly to government and to heavily regulated industries such as banking and insurance following the 2008 financial crisis. In the 2015 survey, some 9% of respondents pointed to their CDO as the custodian of the corporate data strategy and capabilities. The emergence of this role comes at a good time, especially as business executives from across the functional spectrum have become much more technology-literate and involved in the design and execution of their data strategy and initiatives.
Emergence of the Chief Data Officer (CDO)
Increased involvement from the business comes with the challenge of co-ordinating agendas, aligning priorities and communicating effectively with all stakeholders. Mr Chandrashekar of ManpowerGroup says:
“There is strong alignment and articulation at the C-level. People on the frontline, such as sales and operational staff, are also data-driven. The disconnect often happens in the middle, and the challenge is to make the data flow from top to bottom. Engaging the business is critical—data strategy cannot be seen as just a central initiative.”
And few today excel at engaging the business. In the EIU 2015 survey, when asked to rate their company’s competence across different data-related corporate capabilities, respondents expressed the least confidence in their ability to engage employees across the organisation to use data in day-to-day decision-making (only 26% rated their company as “very competent”, while 22% saw themselves as “not at all competent”). High-quality, consistent engagement across layers of the organisation and across horizontal functional lines is in high demand, and in short supply.
Click Image To Enlarge
Enter the CDO. Mr Krishnamurthy of Cognizant Technology Solutions says:
“The CDO has emerged as the embodiment of ‘integrated leadership’.”
He points out that the best-designed CDO roles are focused on three top-level priorities:
Ensuring availability and integrity of data across the organisation;
Driving adoption—from small-scale pilots to company-wide rollouts; and
Driving the monetisation of new data capabilities.
Most importantly, the role is about organisational engagement, brokering between agendas and balancing priorities among big data initiatives. Thus, finding the right senior talent to fill the CDO role can be tricky, as Edd Dumbill, vice-president of marketing and strategy at Silicon Valley Data Science, a big data consulting firm, points out:
“They have to know technology, they have to know the business, and they have to be a political wiz.”
Foundational and Talent Challenges
Companies have made great strides in embracing data as a strategic asset, making the necessary technology investments, and even beginning to evolve their corporate structure. Centralised leadership allows for better co-ordination in strategy and execution of initiatives. And executives, both on the business side and in IT, are much more focused on deploying their limited resources on top-priority data projects that extract tangible business value from these investments.
However, significant challenges still plague most companies—and that’s true even for companies with ample financial resources. The most daunting challenges relate to three highly technical and operational aspects of big data: quality, quantity and security (see Figure 8). These are fundamental aspects of data management, yet companies are far from having resolved them completely and with full confidence, which holds back progress toward more advanced, value-added aspects of data management.
In the last four years, the problems posed by the overwhelming amount of data companies can access and collect have only been exacerbated further. In 2011, one in eight companies said they had so much data that they struggled to make sense of them—in 2015 this was nearly one in four companies. And today, more than half of executives (54%) say they probably leverage only half of their valuable data (Figure 3).
Given the sheer volumes, ensuring the integrity and quality of data, and arriving at the proverbial “single source of truth”, are still major problems. And thus the ultimate challenge of extracting meaningful and actionable business knowledge from data remains a significant one for most companies, perhaps even slightly more so for those that say they are strong financial performers, as they may be more ambitious with their data strategy. Still, only 16% of these companies cite extracting business insights as a top challenge; among reported poor financial performers, the figure was 24%. Regardless of financial performance, 33% of all survey respondents continue to struggle with managing the vast amount of data and 41% with maintaining its quality (Figure 9).
Click Image To Enlarge
On the organisational front, companies have made strides in both creating the right structures and roles, as well as recruiting key talent to enable them to formulate and begin executing their data strategy. However, the talent market in the data and analytics field is still very tight.
This is still especially true in the market for data strategists—executives who are expected to speak the languages of both technology and data science, as well as understand the business, the markets and the customers (see the section Paving the way for the CDO). These rare and invaluable executives—the “effective engagers”, as Ms Merkel of Zurich Insurance calls them—are in short supply and high demand. As Mr Feeley of Siemens puts it,
“There’s a war for talent, particularly for people who combine data expertise with domain knowledge.”
In a blog post dated December 8, 2015, in which I wrote about the market for data scientists, I mentioned the serious talent shortage in big data and analytics.
Courtesy of an article dated November 30, 2015 appearing in Forbes and the whitepaper prepared by The Economist Intelligence Unit titled, "Big Data Evolution"
What are the key roles in data science? Here's a look at the various job positions in the data science industry and what they mean.
A data scientist cleans, massages, and organizes Big Data, according to DataCamp in the following infographic.
A database administrator, on the other hand, focuses on backup and recovery, data modeling and design, distributed computing, database systems, and data security.
Another role is statistician. This person "collects, analyzes, and interprets qualitative as well as quantitative data with statistical theories and methods," according to DataCamp.
To find out more about the different roles in data science, click or tap on the infographic.
Click Image To Enlarge
COMMENTARY: Data-driven business processes are not a nice-to-have but a need-to-have capability today. So, if you’re an executive, manager, or team leader, one of your toughest assignments is managing and organizing your analytics and reporting initiative.
The days of business as usual are over. Data generation costs are falling every day, and the costs of collection and storage are falling with them. The business requirement for speed from insight to action is increasing. Systems of Record, Systems of Engagement and Systems of Insight are being transformed by consumerization and digital.
With this tsunami of data and new applications, the bottleneck is clearly shifting from transaction processing to Analytics & Insight-driven “sense-and-respond” Action. This slide from IBM’s Investor Briefing summarizes the data-driven transformation underway in most businesses.
Courtesy of an article dated December 8, 2015 appearing in MarketingProfs and an article dated May 28, 2015 appearing in Ravi Kalakota's Practical Analytics
The huge demand for big data analytics, and the scarcity of individuals with the right skills and experience, are driving starting salaries for data scientists above $200,000, and it is going to get worse.
A new species of techie is in demand these days—not only in Silicon Valley, but also in company headquarters around the world. Pascal Clement, the head of Amadeus Travel Intelligence in Madrid, says:
“Data scientists are the new superheroes.”
The description isn’t exactly hyperbolic: The qualifications for the job include the strength to tunnel through mountains of information and the vision to discern patterns where others see none. Clement’s outfit is part of Amadeus IT Holding, the world’s largest manager of flight bookings for airlines, which has more than 40 data scientists on its payroll, including some with a background in astrophysics. The company recently launched Schedule Recovery, a product that tracks delays and automatically rebooks all affected passengers.
A study by McKinsey projects that “by 2018, the U.S. alone may face a 50 percent to 60 percent gap between supply and requisite demand of deep analytic talent.” The shortage is already being felt across a broad spectrum of industries, including aerospace, insurance, pharmaceuticals, and finance. When the consulting firm Accenture surveyed its clients on their big-data strategies in April 2014, more than 90 percent said they planned to hire more employees with expertise in data science—most within a year. However, 41 percent of the more than 1,000 respondents cited a lack of talent as a chief obstacle. Narendra Mulani, senior managing director at Accenture Analytics, says:
“It will get worse before it gets better.”
Many data scientists have Ph.D.s or postdoctorates and a background in academic research, says Marco Bressan, president for data and analytics at BBVA, a Spanish bank that operates in 31 countries and has a team of more than 20 data scientists. He says:
“We have nanotechnologists, physicists, mathematicians, specialists in robotics. It’s people who can explore large volumes of data that aren’t structured.”
So-called unstructured data can include e-mails, videos, photos, social media, and other user-generated content. Data scientists write algorithms to extract insights from these troves of information. But “true data scientists are rare,” says Ricard Benjamins, head of business intelligence and big data at Telefónica, Europe’s second-largest phone company, which employs more than 200 of them. Stan Humphries, chief economist at Zillow, the real estate listings site, says:
“You can find a great developer and a great researcher who has a background in statistics, and maybe you can find a great problem solver, but to find that in the same person is hard.”
Universities are taking note. MIT, where graduate students in physics, astronomy, and biology are fielding offers from outside their chosen fields, is in the process of setting up a dedicated data-science institute. Marilyn Wilson, the university’s associate director for career development, says the center will begin enrolling graduate degree candidates in 2016.
In the U.K., the University of Warwick introduced a three-year undergraduate data-science program last year, which David Firth, the program’s mastermind, says may well be the first of its kind. He says:
“Big Business was complaining about the lack of people. Finance is a major employer, but also large-scale insurers, large online commercial retailers, high-tech startups, and government, which has huge data sets.”
Accenture’s Mulani says he’s tallied some 30 new data-science programs in North America, either up and running or in the works. The University of Virginia began offering a master’s in 2014, as did Stanford. Many of those students may be tempted to drop out before collecting their degrees. Margot Gerritsen, director of Stanford’s Institute for Computational & Mathematical Engineering, says:
“Companies are scrambling. We have second- and third-year students getting offered salaries much higher than what I get.”
Starting pay for some full-time jobs is above $200,000, she reports. Summer internships, meanwhile, pay anywhere from $6,000 to $10,000 a month. To make these stints memorable, many employers offer perks such as free meals, complimentary gym memberships, and occasionally temporary housing. Gerritsen says:
“Sometimes you read about students getting abused in internships and working like slaves. We don’t see that.”
The bottom line: McKinsey projects that by 2018 demand for data scientists may be as much as 60 percent greater than the supply.
COMMENTARY: There is no doubt that data scientists are in high demand. There is simply not enough talent to fill the jobs. Why? Because the sexiest job of the 21st century requires a mixture of broad, multi-disciplinary skills drawn from the intersection of mathematics, statistics, computer science, communication and business. Finding a data scientist is hard. Finding people who understand what a data scientist is can be equally hard.
Jean-Paul Isson of Monster Worldwide, Inc. says:
“Being a data scientist is not only about data crunching. It’s about understanding the business challenge, creating some valuable actionable insights to the data, and communicating their findings to the business.”
It is very unlikely that you will be able to hire a data scientist who can solve all your data problems. The skill-set presented below is a guide on how the modern data team should be equipped.
Click Image To Enlarge
What Makes a Great Data Scientist?
What are the personality traits of today’s data scientists? Do they have the skills needed to support businesses’ changing needs? How can senior management build cohesive teams to drive the most value from big data? Is this fast evolving discipline causing stress to its practitioners? These are some of the questions that SAS Analytics tried to answer when it conducted a survey in 2014 of 596 respondents in the UK and Ireland who identified themselves as part of the data science profession. The survey highlights 10 key personality types as well as some of the challenges data scientists are facing today.
Summary of Findings
Data scientists with “traditional” traits (analytical, logical, technical) make up the largest group (41 per cent), yet a significant proportion of data scientists display other traits, such as strong communication and creativity skills.
We expect a lot from data scientists: to be technically proficient, mathematically agile, business savvy and good at communicating. Yet 55 per cent of data scientists have fewer than three years of experience in the discipline, and more than a quarter are adapting their behaviours to fulfil roles that are not well matched to their skills or work personality profiles.
In most personality types there is roughly a 70:30 split between men and women. The Geeks, the profile with a technical bias and strong logical and analytical skills, shows a higher-than-average percentage of women at 37 per cent, compared with 63 per cent for men.
Data scientists are exhibiting high levels of work-related stress: 55 per cent of respondents report some level of stress, with 1 in 4 male and just under a third of female data scientists being heavily stressed.
Organisations must better identify and define what they need from data scientists. Only then can they build and develop multi-faceted teams with the complementary skills needed to realise the full value of big data.
This SAS survey report looked at the aggregated data of the initial 596 responses, received from 405 men (68 per cent) and 191 women (32 per cent). Respondents’ experience ranged from less than a year to a handful of data science pioneers who have committed more than 21 years to the discipline.
The survey was based on the well-established DISC profiling methodology, which has been used for more than half a century to categorise respondents into a range of personality types with recognised characteristics.
By using DISC, SAS was able to create typical profiles of today’s data scientists and explore how these compare with the skills and personalities required by organisations.
The DISC Personality Profiling Methodology
The DISC methodology has been widely used to assess personality types for decades. This information is often utilised to analyse and build effective teams, improve communication and increase productivity. DISC is based upon a two-axis matrix:
Vertical Axis: Proactive to Reactive
Horizontal Axis: Introvert to Extrovert
This generates four possible combinations, named D, I, S and C. Individuals possess measures of all four of these combinations, but in varying amounts. The value for each combination is calculated and plotted on a graph to illustrate the relative strengths of each. These values can be matched to known behavioural characteristics to describe an individual’s personality profile.
Click Image To Enlarge
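To make these mechanics concrete, here is a minimal, purely illustrative sketch in Python (not the SAS/DISC scoring engine): it assumes each respondent already has raw scores for the four dimensions, picks the dominant combination, and expresses every dimension as a relative strength.

```python
# Illustrative only; this is not the SAS/DISC scoring engine.
# Assumes each respondent already has raw scores for the four DISC dimensions.
from typing import Dict

def dominant_profile(scores: Dict[str, float]) -> str:
    """Return the strongest DISC dimension for a respondent."""
    expected = {"D", "I", "S", "C"}
    if set(scores) != expected:
        raise ValueError(f"expected scores for {expected}, got {set(scores)}")
    return max(scores, key=scores.get)

# Hypothetical respondent: strong C (analytical/logical), weaker I (extroversion).
respondent = {"D": 25.0, "I": 10.0, "S": 20.0, "C": 45.0}
total = sum(respondent.values())
relative_strengths = {dim: score / total for dim, score in respondent.items()}

print(dominant_profile(respondent))   # -> "C"
print(relative_strengths)             # relative strengths to plot on the DISC graph
```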
The Data Scientist Personality Mix
The SAS analysis identified ten psychometric profiles evident within the data scientist community, based on the responses to the survey so far. These profiles are characterised by definable patterns that are well known to scientists and were consistently apparent in the survey sample.
The ‘traditional’ traits associated with data scientists – such as technical, analytical and logical skills – still dominate. However, other less technical traits – such as project management, creativity and good communication skills – are also present. Organisations need data savvy individuals who are technically proficient, mathematically minded, business oriented and strong communicators.
It’s unlikely that any individual will have all of the skills required to maximise the value of big data. So it’s important for managers to identify the particular skills needed and build a cohesive team of individuals with complementary skills and traits. As we will see later in the report, failure to do so can result in individuals trying to fulfil roles to which they are not suited – which may lead to stress and burn-out.
Distribution of Personality Types in SAS Survey Sample
Click Image To Enlarge
Final Conclusions Drawn From The Survey
There is huge pressure on data scientists to display a range of skills: technical, mathematical, creative, business aptitude and communication. Unsurprisingly, organisations struggle to find this mix of skills in just one person, particularly when the majority of data scientists have only a few years of experience.
It is therefore important for managers to build cohesive, diverse teams that can answer all the needs of the organisation – not just data scientists but also data savvy managers to interpret the insights and transform them into actions. Organisations will need to:
Manage the business’ expectations from data scientists
Find and develop individuals with the appropriate skills and personality traits
Create teams of individuals with complementary skills and experience
Encourage peer to peer support, learning and development
Match training to the business’ requirements.
It’s possible that, as technology develops, there will be less of a weighting towards the more traditional, technical personality traits, while demand for the less frequently found profiles such as the Ground Breakers, the Teachers and the Seekers will grow. As more respondents take part in SAS surveys, SAS hopes to be able to explore this interesting development further.
“The dilemma is, as it is often said, correlation does not imply causation. The discovery of a predictive relationship between A and B does not mean one causes the other, not even indirectly. No way, nohow.” - Eric Siegel, Predictive Analytics: The Power to Predict Who Will Click, Buy, Lie or Die
What is predictive analytics?
Predictive analytics is the use of data, statistical algorithms and machine-learning techniques to identify the likelihood of future outcomes based on historical data.
The goal is to go beyond descriptive statistics, which report on what has happened, to providing a best assessment of what will happen in the future. The end result is to streamline decision making and produce new insights that lead to better actions.
Predictive models use known results to develop (or train) a model that can be used to predict values for different or new data. The modeling results in predictions that represent a probability of the target variable (for example, revenue) based on estimated significance from a set of input variables. This is different from descriptive models that help you understand what happened or diagnostic models that help you understand key relationships and determine why something happened.
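As a deliberately simplified illustration of that train-then-score idea, the sketch below (assuming scikit-learn is available; the data and feature meanings are invented) fits a model on historical records with known outcomes and then produces a probability of the target for new records.

```python
# Minimal sketch of "train on known results, score new data".
# Assumes scikit-learn; feature and target definitions are made up for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))                    # e.g. recency, frequency, spend
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(size=1000) > 0).astype(int)  # known outcome

X_train, X_new, y_train, _ = train_test_split(X, y, test_size=0.2, random_state=0)

model = LogisticRegression().fit(X_train, y_train)   # "train" on historical data
proba = model.predict_proba(X_new)[:, 1]             # probability of the target
print(proba[:5])
```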
More and more organizations are turning to predictive analytics to increase their bottom line and competitive advantage. Why now?
Growing volumes and types of data and more interest in using data to produce valuable information.
Faster, cheaper computers and easier-to-use software.
Tougher economic conditions and a need for competitive differentiation.
With interactive and easy-to-use software becoming more prevalent, predictive analytics is no longer just the domain of mathematicians and statisticians. Business analysts and line-of-business experts are using these technologies as well.
What is predictive analytics used for?
A 2014 TDWI report found that the top five things predictive analytics is used for are to:
Identify trends.
Understand customers.
Improve business performance.
Drive strategic decision making.
Predict behavior.
Some of the most common applications of predictive analytics include:
Fraud detection and security – Predictive analytics can help stop losses due to fraudulent activity before they occur. By combining multiple detection methods – business rules, anomaly detection, link analytics, etc. – you get greater accuracy and better predictive performance. And in today’s world, cybersecurity is a growing concern. High-performance behavioral analytics examines all actions on a network in real time to spot abnormalities that may indicate occupational fraud, zero-day vulnerabilities and advanced persistent threats.
Marketing – Most modern organizations use predictive analytics to determine customer responses or purchases, as well as promote cross-sell opportunities. Predictive models help businesses attract, retain and grow the most profitable customers and maximize their marketing spending.
Operations – Many companies use predictive models to forecast inventory and manage factory resources. Airlines use predictive analytics to decide how many tickets to sell at each price for a flight. Hotels try to predict the number of guests they can expect on any given night to adjust prices to maximize occupancy and increase revenue. Predictive analytics enables organizations to function more efficiently.
Risk – One of the most well-known examples of predictive analytics is credit scoring. Credit scores are used ubiquitously to assess a buyer’s likelihood of default for purchases ranging from homes to cars to insurance. A credit score is a number generated by a predictive model that incorporates all of the data relevant to a person’s creditworthiness. Other risk-related uses include insurance claims and collections.
Click Image To Enlarge
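To make the credit-scoring use described above tangible, here is a toy sketch (not any bureau's actual formula; the scaling constants are arbitrary) showing how a model's probability of default can be rescaled into a familiar-looking score.

```python
# Toy illustration: rescale a model's default probability into a credit-style score.
# Constants are arbitrary; this is not any real scorecard's calibration.
import math

def probability_to_score(p_default: float,
                         base_score: float = 600.0,
                         base_odds: float = 50.0,
                         pdo: float = 20.0) -> float:
    """Map probability of default to a score.

    Toy scaling: `base_score` corresponds to good:bad odds of `base_odds`:1,
    and every `pdo` points doubles the odds.
    """
    odds = (1.0 - p_default) / p_default        # odds of *not* defaulting
    factor = pdo / math.log(2)
    offset = base_score - factor * math.log(base_odds)
    return offset + factor * math.log(odds)

print(round(probability_to_score(0.02)))   # low risk  -> higher score
print(round(probability_to_score(0.30)))   # high risk -> lower score
```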
Predictive analytics use across industries -- real life examples
Any industry can use predictive analytics to optimize their operations and increase revenue. Here are a few examples:
Credit card, banking and financial services. Detect and reduce fraud, measure credit risk, maximize cross-sell/up-sell opportunities, retain customers and optimize marketing campaigns. Commonwealth Bank can reliably predict the likelihood of fraud activity for any given transaction before it is authorized – within 40 milliseconds of the transaction being initiated.
Governments and the public sector. Improve service and performance; detect and prevent fraud, improper payments and the misuse of funds and taxpayer dollars; and detect criminal activities and patterns. The Hong Kong government visualizes and analyzes big, unstructured data to anticipate and address public complaints.
Health care providers. Predict the effectiveness of new procedures, medical tests and medications, and improve services or outcomes by providing safe and effective patient care. Taipei Medical University executives analyze and monitor performance across all hospitals in its system.
Health insurers. Detect and handle insurance claims fraud, identify which patients are most at risk of chronic diseases and know which interventions make the most medical and financial sense. One of the largest pharmacy benefits companies in the US, Express Scripts, uses analytics to identify patients not adhering to their prescribed treatments, resulting in a savings of $1500 to $9000 per patient.
Insurance companies. Determine insurance premium rates, detect claims fraud, optimize claims processes, retain customers, improve profitability and optimize marketing campaigns. Within two hours of an earthquake striking rural New Zealand, Farmers Mutual Group assessors were headed to affected areas. With SAS Analytics, they knew who their most at-risk policy holders were and chartered a helicopter to get to them quickly.
Manufacturers. Identify factors leading to reduced quality and production failures, and optimize parts, service resources and distribution. Lenovo used predictive analytics to better understand warranty claims, leading to a 10 to 15 percent reduction in warranty costs.
Media and entertainment. Deepen insight into audiences by identifying influencing attributes, trends, drivers and desires across properties, and score visitors to determine appropriate audience segments and behavior value. How is the slot floor doing every day? How is the gaming floor performing? How are the nonsmoking tables compared to the smoking tables? The answers – which previously could take numerous weeks and many dollars to find out – are now coming in minutes and at a far lower cost for Foxwoods Resort Casino.
Oil, gas and utility companies. Predict equipment failures and future resource needs, mitigate safety and reliability risks, and improve performance. Salt River Project is the second-largest public power utility in the US and one of Arizona's largest water suppliers. A sophisticated forecasting model helps them know the best time to sell excess electricity for the best price.
Retailers. Assess the effectiveness of promotional events and campaigns, predict which offers are most appropriate for consumers, determine which products to stock where and how to build brand loyalty. Staples analyzes online and offline consumer behavior to provide a complete picture of its customers, and realized a 137% ROI.
Sports franchises. Sports analytics is a hot area, thanks in part to Nate Silver and tournament predictions. The NBA’s Orlando Magic uses SAS predictive analytics to improve revenue and determine starting lineups.
Click Image To Enlarge
What do you need to get started?
Problem to solve. The first thing you need to get started using predictive analytics is a problem to solve. What do you want to know about the future based on the past? What do you want to understand and predict? You’ll also want to consider what will be done with the predictions. What decisions will be driven by the insights? What actions will be taken?
Data. Second, you’ll need data. In today’s world, that means data from a lot of places. Transactional systems, data collected by sensors, third-party information, call center notes, web logs, etc. You’ll need a data wrangler, or someone with data management experience, to help you cleanse and prep the data for analysis. To prepare the data for a predictive modeling exercise also requires someone who understands both the data and the business problem. How you define your target is essential to how you can interpret the outcome. (Data preparation is considered one of the most time-consuming aspects of the analysis process. So be prepared for that.)
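The data-wrangling step described above is easiest to see in code. The sketch below, using pandas on invented column names, shows the flavour of the cleansing and target definition involved; real projects are considerably messier.

```python
# Hypothetical data-prep sketch with pandas; column names and values are invented.
import pandas as pd

raw = pd.DataFrame({
    "customer_id": [1, 1, 2, 3, 3],
    "last_purchase": ["2015-01-03", "2015-02-10", None, "2015-03-01", "2015-03-01"],
    "total_spend": [120.0, 80.0, None, 45.0, 45.0],
    "churned": [0, 0, 1, 0, 0],
})

prepared = (
    raw.drop_duplicates()                                            # remove exact duplicates
       .assign(last_purchase=lambda d: pd.to_datetime(d["last_purchase"]))
       .assign(total_spend=lambda d: d["total_spend"].fillna(d["total_spend"].median()))
)

# Define the modeling target explicitly; how you define it drives interpretation.
target = prepared["churned"]
features = prepared.drop(columns=["churned", "customer_id"])
print(features.dtypes)
```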
Model-building tools. After that, the predictive model building begins. With increasingly easy-to-use software becoming more available, a wider array of people can build analytical models. But you’ll still likely need some sort of data analyst who can help you refine your models and come up with the best performer. And then you might need someone in IT who can help deploy your models. That means putting the models to work on your chosen data – and that’s where you get your results.
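A minimal sketch of that build, refine and hand-off loop might look like the following (assuming scikit-learn and joblib, with synthetic data standing in for your own): cross-validation picks the better-performing candidate, and the chosen model is saved so IT can deploy it.

```python
# Hedged sketch: compare two candidate models, keep the better one, persist it.
import joblib
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 4))
y = (X[:, 0] - X[:, 1] > 0).astype(int)          # stand-in for a real target

candidates = {
    "logistic": LogisticRegression(),
    "random_forest": RandomForestClassifier(n_estimators=100, random_state=1),
}
scores = {name: cross_val_score(m, X, y, cv=5, scoring="roc_auc").mean()
          for name, m in candidates.items()}

best_name = max(scores, key=scores.get)
best_model = candidates[best_name].fit(X, y)

joblib.dump(best_model, "best_model.joblib")      # hand-off artifact for deployment
print(best_name, scores)
```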
Predictive modeling requires a team approach. You need people who understand the business problem to be solved. Someone who knows how to prepare data for analysis. Someone who can build and refine the models. Someone in IT to ensure that you have the right analytics infrastructure for model building and deployment. And an executive sponsor can help make your analytic hopes a reality.
COMMENTARY: According to Malene Haxholdt, Global Marketing Manager, Business Analytics, SAS, there are several good reasons why businesses should start predictive analytics initiatives.
Growing revenue.
Lowering costs.
Establishing governance and compliance.
Companies are looking to get value from predictive analytics in many business areas. The value comes when you can take data, apply analytics, and act on the results. The value ultimately means growing revenue, lowering costs, or establishing governance and compliance. Growing revenue is often associated with customer analytics and being better at retention modeling and cross-sell and up-sell activities. Lowering costs often comes from analytics used to improve processes as well as from using internal capacity in any form. Especially in the financial industry, the driver for predictive analytics can be part of a compliance and governance strategy.
Predictive analytics is relevant and useful across all industries. Some industries are more mature in their use and implementation of predictive analytics. The most common applications of predictive analytics come from a need to better manage fraud, risk, equipment failure, or customer interactions. What is common across industries is the accelerating growth of data and the desire to turn it into valuable information. Predictive analytics is a component of that journey.
It is key to remember that predictive analytics is only valuable if you can turn the results of analytical models into actions. The business process, the people, and the technology need to be aligned to successfully deploy predictive analytics. Think of predictive analytics as part of a life cycle that consists of (1) managing all of your data, (2) exploring all of your data, (3) building your models with the best techniques, and (4) deploying and monitoring your models.
It is important that the organization believes and understands that predictive analytics is driving better decisions. The software tools that allow you to start building predictive models are very approachable and do not require you to be a technical expert. We see a tremendous uptick in interest in interactive predictive modeling using visual analytics and visual statistics. Without any coding required, you can start exploring your data and get value. More mature companies are hiring more data scientists to further grow and explore the possibilities predictive analytics brings. Having human skills that can align the technology and business understanding is key to success.
With the reality of big data, companies are exploring new techniques to leverage the value hidden in new types of data. Being able to explore all of your data quickly and interactively is driving the need for data visualization techniques and interactive predictive modeling on very large amounts of data, fast. We also see a growing interest in machine-learning techniques such as Random Forest and in techniques for handling unstructured data. Take, for example, SAS Contextual Analysis. This next-generation text modeling software combines the ease of machine learning with subject-matter expertise, enabling powerful text models to be defined easily from unstructured data.
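SAS Contextual Analysis itself is proprietary, but the general idea of building a text model from unstructured data can be sketched with common open-source tools. The example below uses scikit-learn with invented documents and labels, purely for illustration.

```python
# Minimal open-source sketch of text modeling on unstructured data.
# Not SAS Contextual Analysis; documents and labels are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

docs = [
    "great service, will definitely buy again",
    "the package arrived late and damaged",
    "friendly support resolved my issue quickly",
    "still waiting for a refund, very disappointed",
]
labels = [1, 0, 1, 0]                      # 1 = positive sentiment, 0 = negative

text_model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
text_model.fit(docs, labels)

print(text_model.predict(["support was quick and friendly",
                          "my order was damaged in transit"]))
```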
To get sustainable value from predictive analytics, IT and business users are both key in the process. A modern CIO and IT department work closely with the business to enable predictive analytics throughout the organization by providing data access and approachable analytics tools to the right users. Think of it as self-service analytics. The IT department is, in some cases, the true driver of innovation because it can enable the use of all data and help get even more predictive power in the models used by the business. IT’s role is critical in selecting an architecture that will meet future demands around diverse analytical data preparation tasks, reducing model building latency and quickly deploying models into operational systems.
All companies, no matter the size, will benefit from making better decisions in a world of uncertainty. We see small companies that apply predictive analytics as a key strategy to growth. In fact, companies that embed predictive analytics as a cornerstone in their business tend to be more successful in the long run.
This week, Walmart's shares crashed on October 14 after the company reported a disappointing profit outlook. Profits will fall 6% to 12% next year, the company said.
Walmart founder Sam Walton wrote in his 1992 memoir "Sam Walton: Made in America":
"Now, when it comes to Wal-Mart, there's no two ways about it: I'm cheap,"
If the notoriously cheap Walton were alive today, he would probably be surprised that his pioneering profit formula -- keep costs low so that prices are as low as possible and customers buy more -- is now virtually impossible to maintain. Full blame for that can be pinned on employees demanding higher wages and the rise of e-commerce giants such as Amazon (AMZN), which are open 24 hours a day, seven days a week.
Walmart founder Sam Walton (Click Image To Enlarge)
Perhaps the retail legend saw some of this on display from up above on what was a dark day in Walmart's history on Wednesday. Walmart's usually stable stock price tumbled 11% as the company said at its annual investor day that higher wages for associates will dent profits by about $1.5 billion in the 2017 fiscal year, following a $1.2 billion hit this fiscal year.
For years, Walmart has had the upper hand with its employees, paying them state minimum wages, offering basic healthcare plans, and using computerized systems to control their hours. But with the rise of social media and the internet giving employees a free platform to express views, and the rising cost of living pushing state governments to raise minimum wages, Walmart has had to change with the times.
As a result, it has lost a key element of its business model put in place by Walton that has supported the company's profits since its founding. Further, rising employee costs are arriving as online shopping is also limiting the ability of Walmart to raise prices at all to help deliver sustainable profit growth.
Walmart added on Wednesday that it expects earnings per share to decline between 6% and 12% next year. Sterne Agee analyst Charles Grom wrote in a note to clients on Wednesday:
"What's surprising to us is that the new outlook is incorporating roughly $20 billion in share buybacks in the next two years, which implies significant margin contraction along with modest sales growth."
Employee costs aren't the only line item on the income statement Wal-Mart is battling to control.
Walmart's investments in e-commerce and digital initiatives are expected to total about $1.1 billion in the 2017 fiscal year. This year, Walmart is projected to shell out $1.2 billion to $1.5 billion on e-commerce and digital.
These investments are a necessary evil to stay competitive with Amazon, as well as with Best Buy (BBY) and Target (TGT), which are continuing down their own paths to reduce prices, improve checkout speeds and offer a range of delivery options.
When Walton opened the first Walmart store in Rogers, Arkansas in 1962, though, it's unlikely he foresaw a day when a sweater could be purchased off a handheld device. Today it can be, however, and to make it happen seamlessly and at the lowest cost is proving to be a major profit headwind for the once mighty Walmart.
Even with such enormous investments in digital, however, Walmart still sees sales as being under pressure. Walmart said it now expects net sales growth for the current fiscal year to be relatively flat, down from a forecast of between 1% and 2% in February. Excluding the impact of currency exchange fluctuations -- mostly the stronger dollar -- net sales growth is estimated to be about 3%.
Goldman Sachs analyst Matthew Fassler pointed out in a note on Wednesday:
"New guidance reflects that Walmart's competitive edge -- historically largely assortment and price -- has faded relative to purveyors of extreme value (warehouse clubs, hard discounters) or extreme convenience (dollar stores, hard discounters), as e-commerce has neutralized the impact of selection."
The deterioration of Walton's pioneering business model has been playing out for some time in Walmart's numbers.
According to Bloomberg data, Walmart's operating profit margin, which takes into account its costs to run stores and pay for wages and healthcare, has fallen from 7.1% in 1987 to 5.6% in 2015. Since 1988, Walmart's return on assets -- an indicator of how profitable a company is relative to its total assets -- has gone from 13.7% to 8.0% in 2015.
The alarming downtrends in these measures, partly fueled by investments to compete online and compensate workers better, have been overlooked for years by Walmart's investors who have focused on the retailer's healthy dividend and relative stability.
Walmart CEO Doug McMillon (Click Image To Enlarge)
But the stock's nosedive on Wednesday, and an 11% drop for the year prior to that point, suggests Walmart's investor base is growing concerned that the retail giant's business is broken. Wrote Stifel analyst David Schick in a note,
"The market is reacting to meaningful evidence that Walmart has substantially over-earned."
In other words, Walmart is now being forced to play catch up on investments it neglected to make in the past to keep the profits flowing.
And if Walmart's business model is broken, profits are likely to stay under pressure for some time, and future dividend hikes may not be as robust as in years past.
Now, Walmart has to find a way to channel Walton and become cheap once more. Indications have emerged that it may be willing to take drastic actions to return to some sense of frugality.
Walmart CEO Doug McMillon said at the investor event:
"We are more than open to re-shaping our portfolio."
McMillon added the company continues to "evaluate its portfolio" of assets, and pointed to Walmart's history of exiting non-strategic assets and closing underperforming stores.
For example, the retailer exited Germany in 2006 and took a $1 billion hit to profits as a result. By shedding assets such as Sam's Club or lagging international operations, Walmart would also be shedding costs related to employees and maintaining physical stores. In turn, those savings could be used to offset the investments being made in core assets such as the online business and Walmart U.S., or simply be brought to the bottom line to appease investors.
Walton wrote:
"You can make a lot of different mistakes and still recover if you run an efficient operation -- or you can be brilliant and still go out of business if you're too inefficient."
Words of wisdom from beyond the grave for McMillon and the execs tasked with turning around the largest ship in retail.
Walmart did not respond to a request for comment for this story.
COMMENTARY: When I heard that Walmart's stock had plummeted 10% in mid-October over its shitty earnings report, declining profit margins and the CEO's statements that earnings will continue to be stagnant due to higher employee wages and investments in ecommerce, I shed crocodile tears and managed a smile.
Walmart has been the "king of exploiters." For years it has exploited its employees, vendors and foreign suppliers.
Walmart has been the subject of numerous lawsuits from employees seeking payment for overtime and unfair labor practices, including paying female managers less than their male counterparts, among other sins.
For years Walmart has squeezed its vendors for lower prices, often forcing some into bankruptcy, while others have fled the retail giant, swearing never to return.
The pricing deals it has extracted from Chinese suppliers have cost hundreds of thousands of US jobs and driven many small US businesses out of business.
So, if you were to ask whether I feel the slightest remorse for this disgusting company? Nope.
The Walton family, prior to the recent stock market decline, was worth $140 billion, making it the wealthiest American family and the very top of that 1% which owns 90% of America's total net worth. After the stock price crash, they are still worth at least $125 billion. Now I hear they are blaming the far left's campaigns to increase the minimum wage for their problems.
From my viewpoint, the cause of Walmart's problems is mostly about Walmart's values, and they are all about profits, and more profits, at any cost.
Do you think the Walton family cares about all the small retailers they have put out of business, or the Chinese workers being exploited by suppliers so that they can pay bottom dollar for their goods, the US vendors they squeezed for lower prices then put out of business, or even their employees who are finally getting paid minimum wage (a paltry $9.00 per hour)?
This is more of a condemnation of Walmart's values coming back to bite them in the ass. It is not about their business model going to pieces. You can bet that Walmart's CEO Doug McMillon and his team will find ways to restore the viability of their dirty business model by screwing somebody. Stay tuned for more bad news about this indecent company.
Wharton Business Radio ran an excellent broadcast titled Marketing Matters Walmart, which details where Walmart fucked up and what really landed the company in this disastrous hole: its failure to react to the dollar discount stores, its botched entry into Canada, its badly executed store remodeling, and its failure to gain traction in online sales to combat Amazon. Definitely worth listening to.
Courtesy of an article dated October 16, 2015 appearing in The Street, an article dated February 19, 2015 appearing in Newsmax, an article dated October 16, 2015 appearing in Business Insider, and an article dated October 14, 2015 appearing in Fortune Magazine
Data-driven marketing, in theory, is about better using the flood of customer-related data to integrate and optimize marketing efforts in the age of an empowered consumer and Big Data.
What matters most is the optimization of the customer experience, relevance and (perceived) customer value as a driver of business value. Data-driven marketing certainly is not (just) about advertising and programmatic ad buying as some believe. Nor is it just about campaigns. On the contrary: if done well, data-driven marketing is part of digital marketing transformations whereby connecting around the customer across the customer life cycle is key.
The Role and Evolution of Data-Driven Marketing
One of the pioneers and leading voices is data analytics and marketing software vendor Teradata. The company recently released the results of its Teradata 2015 Global Data-Driven Marketing Survey. According to Teradata, marketers who adopt an integrated and data-driven marketing approach become more empowered across all areas, including the various stages of “engagement” described in the illustration below, taken from the company’s 2013 survey.
The Various Stages of Customer Engagement - Teradata (Click Image To Enlarge)
When it launched the 2015 edition of the survey, Teradata said it saw some “dramatic shifts in how companies and marketers are deriving business value from data, integrated marketing platforms, and customer-centric data-driven marketing strategies” compared with its 2013 report.
Among the key findings of the Teradata 2015 Global Data-Driven Marketing Survey:
A large majority of marketers (90%) want to move beyond segmentation towards one-to-one personalization in a real-time interaction context.
Faster and more accurate decisions are reported as key benefits of using data in this real-time economy. Two-thirds of respondents cite speed and accuracy in decisions as key benefits.
For 38% of respondents the major challenge is improving both customer acquisition and customer retention.
78% of marketers in the survey claim to use data systematically. In 2013 this was only 36%.
Omni-channel consistency and silos remain hurdles
Respondents see data-driven marketing as a means to an end and for several an integrated marketing cloud approach seems like the way forward from a platform perspective. Of course, Teradata is one of several major providers of such a marketing cloud (like Adobe, Oracle, Salesforce.com, IBM, Microsoft etc.).
Obviously there is a gap between what marketers would like to achieve with data-driven marketing and what they actually achieve. Many hurdles remain, among others regarding consistency in an omni-channel marketing context whereby silos are the eternal challenge, also from a campaign marketing perspective.
More about the evolutions, challenges, customer data issues (privacy, regulation, ownership) etc. in the press release, the report and the infographic below.
Given the huge attention paid to data-driven marketing (and, admittedly, the big marketing cloud wars we see happening), there is of course more research looking at the state of data-driven marketing and its drivers. In a second infographic, below the one from Teradata, we’ll take a closer look at that additional research.
Data-driven marketing: customer-centricity and customer experience first?
It’s not just the vendors and marketing cloud providers such as Teradata who come up with research.
Recently, the GlobalDMA, “an organisation that represents, supports and unites marketing associations from around the globe that focus on data-driven marketing”, looked at the state of data-driven marketing and data-driven marketing practices in 17 global markets.
It did so in collaboration with Winterberry Group, a strategic consultancy for the advertising and marketing industry.
You can access the report for free on the website of the GlobalDMA (registration required). Just like Teradata, the report found that marketers – obviously – increasingly see the importance of a smart use of data in marketing, customer experience and advertising (the whole programmatic advertising concept).
In an infographic on AdWeek, infographist Carlos Monteiro summarized some of the key takeaways from that research (and from Teradata’s 2015 Global Data-Driven Marketing Survey).
How Data-Driven Marketers Outperform Their Competition - Adweek by Carlos Monteiro (Click Image To Enlarge)
Among the key takeaways of the data-driven marketing report by the GlobalDMA:
77% of marketers are confident in the data-driven approach and 74% expect to increase data marketing budgets this year.
Data efforts focus first and foremost on offers, messages and content (69% of respondents). A data-driven strategy or data-driven product development ranks second. Customer experience optimization unfortunately ranks only third, with 49% of respondents.
Among the key drivers of increased data marketing, the first is a need to be more customer-centric (reported by 53% of respondents). Maximizing efficiency and return ranks second, followed by gaining more knowledge of customers and prospects.
The question is what will be done with this knowledge. And here we see that 20% report a need to align with digital consumer preferences.
So it seems the digital customer experience does matter: it is essential for a more customer-centric approach and for the, in practice often extremely urgent, need to align with (digital) consumer preferences or, better, to catch up with the changed customer reality, which in the end is what digital marketing transformation is all about in the first place.
COMMENTARY: Customers today expect—and demand—a seamless and relevant experience. They have grown accustomed to marketers’ knowledge of their preferences and anticipation of their needs. Fractured or conflicting messages from a brand make marketers seem unorganized and annoy customers, sometimes even driving them away.
For marketers, the only solution is data-driven marketing for individualized insights. Teradata defines data-driven marketing as
"the process of collecting and connecting large amounts of online data with traditional offline data, rapidly analyzing and gaining cross-channel insights about customers, and then bringing that insight to market via a highly personalized marketing campaign tailored to the customer at his/her point of need."
Data-driven marketing is the means that leads to the end of individualized insights: moving beyond segmentation to true one-to-one personalization in a real-time context— achieving the capability to distill insights at the individual level, and the ability to target known customers in a digital marketing ecosystem that often settles for “close enough.”
A well-integrated data-driven marketing program that provides a single view of the customer is the path to such individualization, and can result in business benefits that are just waiting to be tapped. The data in this report is based on a survey of 1,506 marketing and communications executives worldwide, representing all major industries. The survey was conducted in the fall of 2014, and follows a similar survey conducted in 2013.
According to Mark Jeffrey, Kellogg School of Management, Northwestern University, and author of "Data Driven Marketing," there are 15 essential marketing metrics that all marketers should know.
Click Image To Enlarge
People-Centric Targeting Across All Screens
What is it: Media consumption habits are more fragmented and diverse than ever, as consumers engage with multiple screens and devices. And in this multi-channel world, isolated, one-off marketing strategies and closed technologies no longer make sense. Advertisers don’t need a social plan or a mobile plan; they need to take full advantage of all available data to develop a people-centric plan that zeroes in on customers wherever they are and when they are most likely to engage. This requires platforms that can integrate data across channels and provide transparent, objective analysis of performance across every marketing touch point.
In the following schematic, Mark Jeffrey illustrates the process of gathering customer data from multiple sources, the flow of that data through the processes of validation, cleaning, transformation, aggregation and loading that data into a centralized data warehouse for business intelligence purposes.
Click Image To Enlarge
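A drastically simplified version of that pipeline (validate, clean, transform, aggregate, load) might look like the sketch below, which uses SQLite as a stand-in for a real data warehouse and invented table and column names.

```python
# Toy ETL sketch: multiple sources -> validate/clean/transform -> aggregate -> load.
# SQLite stands in for a real data warehouse; names and numbers are invented.
import sqlite3
import pandas as pd

web = pd.DataFrame({"customer_id": [1, 2, 2], "revenue": [10.0, 5.0, None]})
store = pd.DataFrame({"customer_id": [1, 3], "revenue": [20.0, 7.5]})

combined = pd.concat([web.assign(channel="web"), store.assign(channel="store")])
combined = combined.dropna(subset=["customer_id"])                  # validate
combined["revenue"] = combined["revenue"].fillna(0.0)               # clean
summary = (combined.groupby(["customer_id", "channel"], as_index=False)["revenue"]
                   .sum()
                   .rename(columns={"revenue": "total_revenue"}))   # aggregate

with sqlite3.connect("warehouse.db") as conn:                        # load
    summary.to_sql("customer_revenue", conn, if_exists="replace", index=False)
    print(pd.read_sql("SELECT * FROM customer_revenue", conn))
```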
Why it’s important: People-centric marketing is even reaching the world of linear TV, where new streams of data from set-top boxes and connected televisions are making it possible for advertisers to programmatically buy specific audience segments rather than focusing only on shows and time slots, which is how TV has traditionally been bought and sold.
Data will continue to blur the lines between TV and digital, between mobile and display, and even between direct response and branding. It all comes down to the person — whether the screen they are looking at is on their wall, at their desk or in the palm of their hand.
Why it’s hard: Cross-channel engagement is the cornerstone of people-centric marketing. The challenge, though, lies in evaluating attribution — an understanding of how well channels performed in engaging targets. Several years ago, this degree of accountability was impossible to achieve because data analysis tools hadn’t yet innovated to the point of granular attribution. Today, however, multi-touch attribution platforms, driven by more recent advances in machine learning and predictive analytics, have made this a reality.
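To make the attribution idea concrete, here is a deliberately simple sketch comparing last-touch and linear (equal-credit) attribution over a few hypothetical converting paths; real multi-touch attribution platforms use far more sophisticated, model-driven weighting.

```python
# Simple illustration of two basic attribution rules (not a multi-touch platform).
from collections import defaultdict

# Hypothetical converting paths: ordered channel touches before each conversion.
paths = [
    ["display", "social", "search"],
    ["email", "search"],
    ["social", "social", "email"],
]

last_touch = defaultdict(float)
linear = defaultdict(float)

for path in paths:
    last_touch[path[-1]] += 1.0                    # all credit to the final touch
    for channel in path:
        linear[channel] += 1.0 / len(path)         # equal credit to every touch

print("last-touch:", dict(last_touch))
print("linear:   ", dict(linear))
```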
The explosion of interest in predictive analytics, and in particular sophisticated predictive analytics, has led to renewed interest in the topic of how to plan for a predictive analytics project.
The topic is hardly new, but anyone involved in planning for, or conducting, a serious analysis project needs to be cognizant of the fundamentals. Here we abstract insights and ideas from two classic discussions: the 1996 paper by Fayyad, Piatetsky-Shapiro, and Smyth, “From Data Mining to Knowledge Discovery in Databases,” and the 1999 “Cross-Industry Standard Process (CRISP) for Data Mining” developed by analytical practitioners at Daimler-Chrysler, ISL (later acquired by SPSS and now part of IBM) and NCR (Teradata). While these first-generation discussions clearly date from an earlier technological era, the principal ideas are still relevant and are being rediscovered and translated into modern terminology daily. Here we offer a summary of the essentials, viewed from our perspective of 20 years of experience in the world of data mining and predictive analytics.
Stage #1: Business Understanding (Specification of an Objective)
To successfully conduct an analytics project it is first necessary to have a notion of what is hoped to be accomplished, described in the practical language of business or decision-making. Often the objective can be clearly stated. Examples include:
In an election campaign, identify the swing voters most likely to ultimately vote for a specific candidate (precisely what the Obama campaign undertook in 2012).
Rank the offers you might provide to a website visitor by the probability of acceptance or expected value of the response (something Salford Systems undertook on behalf of a client).
Not all projects have such a clear and focused objective; some may be more generic:
How can I make my overall grocery store operation more efficient logistically?
In this discussion we have in mind the more clearly defined, and thus more narrowly scoped, projects. The CRISP approach calls this first prerequisite “Business Understanding”, which, as Fayyad et al. point out, includes the identification of the objective from the point of view of the client.
Stage #2: Data Inventory and Understanding
Getting a good grasp of exactly what data is available for analysis is critical and involves canvassing both internal (proprietary) and external (publicly available or purchasable) data sources. Identifying, locating, and gaining access to potentially useful data within an organization can be challenging, and a project may be forced to proceed using less data than actually exists due to institutional constraints isolating data silos. An effort to identify all sources of potentially useful data is generally worthwhile, as many large organizations have a myriad of data stores, some of which are generally unknown outside the department that collected them. External data is increasingly available via the Internet and can come from government sources and/or private organizations. Such external data can be extremely helpful in improving the predictive performance of a model and in refining the analytical insights it can yield. Examples of such data include:
Popularity of certain Internet search terms
Stock market index movements
Snowfall in Fairbanks, Alaska
Supplemental data can be of enormous value in providing the relevant context for what is tracked from the perspective of a single enterprise. For example, supplemental data may include prices charged by competitors for the same or similar products offered by a retailer.
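As a small illustration of how supplemental data can be folded in, the sketch below (pandas, with invented numbers) joins an internal daily sales extract to a hypothetical feed of competitor prices so a model can see the pricing context.

```python
# Illustrative only: enriching internal data with an external/supplemental source.
import pandas as pd

internal_sales = pd.DataFrame({
    "date": pd.to_datetime(["2015-11-01", "2015-11-02", "2015-11-03"]),
    "our_price": [9.99, 9.99, 8.99],
    "units_sold": [120, 95, 180],
})

competitor_prices = pd.DataFrame({              # hypothetical external feed
    "date": pd.to_datetime(["2015-11-01", "2015-11-02", "2015-11-03"]),
    "competitor_price": [9.49, 10.49, 9.99],
})

enriched = internal_sales.merge(competitor_prices, on="date", how="left")
enriched["price_gap"] = enriched["our_price"] - enriched["competitor_price"]
print(enriched)
```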
In our consulting experience with financial services enterprises, we have generally found that the data such enterprises amass on their own customers is far more valuable than anything that could be collected or purchased about those same customers externally, but the added value of external data can still justify its acquisition cost.
Stage #3: Assessment of Data for Suitability
Having identified objectives and located potential data the next step involves an assessment of the data with the objective of determining whether the available data can in fact support the original objective or perhaps some modified and even scaled-back objectives. One may discover that the available data is missing essential details (descriptors for the ad actually shown to a web site visitor) or that data is available only for a subpopulation (e.g. only data for the clicks was retained and all non-click information was discarded). Such data omissions make the job of predictive analytics vastly more difficult if not impossible. The topic of mishandled or mismanaged data is a large one and all experienced analysts have their own catalog of horror stories. Our point here is simply that we have to allow for the possibility that data problems may complicate the project if not stop it in its tracks.
The data may also offer too small a sample size for meaningful analysis. This may appear to be a thing of the past, something that will never occur again in the brave new world of big data. But even with huge data stores the fraction that is relevant to a specific analysis may be small.
If you need to build a click model for an ad that has been clicked on only 15 times, you are short of data even if you have one billion impressions for that ad. In sum, we need both a sufficient quality and a sufficient quantity of data.
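A quick back-of-the-envelope check makes the point. The sketch below uses SciPy's beta distribution for a Jeffreys-style interval on the click-through rate; the numbers are purely illustrative, and the lesson is that the uncertainty is driven by the 15 clicks, not the billion impressions.

```python
from scipy.stats import beta

# Rough sketch of the "15 clicks in a billion impressions" point above.
clicks, impressions = 15, 1_000_000_000

# Jeffreys interval for a binomial proportion: Beta(k + 0.5, n - k + 0.5).
lo = beta.ppf(0.025, clicks + 0.5, impressions - clicks + 0.5)
hi = beta.ppf(0.975, clicks + 0.5, impressions - clicks + 0.5)

print(f"point estimate: {clicks / impressions:.2e}")
print(f"95% interval:   {lo:.2e} to {hi:.2e}")
# The interval spans roughly a factor of three from bottom to top: the billion
# impressions do not help, because only the 15 positive cases carry information.
```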
Stage #4: Pilot Project
During the last five years or so, all of our consulting projects have begun with a significantly scaled-down dry run or pilot project; even one major year-long project we conducted back in 1996 began with a three-month scaled-down trial. This is something of an innovation in the practice of advanced analytics, even though the notion is anything but new. Engineers and architects build scale models, jet engines are tested in stationary mounts on the ground long before they are placed on an airplane, and new web pages and marketing campaigns are typically first exposed to just tiny fractions of a major web site’s traffic. Obvious or not, the notion that data analysts should consciously opt for a trial run may be new to some practitioners and is well worth considering. At the very least, analytics could be conducted on a radically slimmed-down version of the data set to accelerate modeling run times.
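For the slimmed-down data idea, something as simple as a reproducible random sample is often enough; the file name and sampling fraction below are assumptions for illustration.

```python
import pandas as pd

# Minimal sketch: draw a small, reproducible sample of the full training data
# so that model variations can be iterated on quickly before full-scale runs.
full = pd.read_csv("transactions_full.csv")

pilot = full.sample(frac=0.02, random_state=17)  # ~2% of rows for the dry run
pilot.to_csv("transactions_pilot.csv", index=False)

print(f"full: {len(full):,} rows  pilot: {len(pilot):,} rows")
```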
Real-World Example: In a project to predict the sales of individual promoted products in a chain of large grocery stores we started with data for a few dozen products for one department, and then expanded with further selected products from other departments. When done with the pilot project we moved on to making the system work for 122,000 products.
Stage #5: Prepare (and Explore) Data
From a manager’s point of view, the important point here is that preparing the data (“beating it into shape”) so that it can be successfully modeled may require considerable time and resources. David Vogel et al. were the winners of the first (and second) round progress prizes in the $3 million Heritage Provider Network predictive modeling competition. In the paper explaining their approach and methods, about one third of the text is devoted to documenting their preparation of the data. While some data mining and predictive analytics tools can work remarkably effectively with unprepared data (TreeNet is clearly one such tool), careful data preparation invariably leads to better, and often substantially better, performance.
Much of the data preparation is guided by rather straightforward examination of the main components of the data expected to be relevant to modeling.
Routine first steps (a minimal sketch in code follows this list):
Ensuring that the data display appropriate values
Scanning for extreme values and/or invalid values
Locating oddities in the data such as unexpected gaps
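Here is a minimal pandas sketch of those routine checks; the file name, column names, and plausible value ranges are assumptions for illustration rather than anything prescribed above.

```python
import pandas as pd

df = pd.read_csv("customer_transactions.csv")

# 1. Appropriate values: basic summary of each column.
print(df.describe(include="all"))

# 2. Extreme or invalid values: flag rows outside a plausible range.
bad_amounts = df[(df["amount"] < 0) | (df["amount"] > 100_000)]
print(f"{len(bad_amounts)} rows with suspicious transaction amounts")

# 3. Unexpected gaps: check for missing dates in what should be a daily series.
dates = pd.to_datetime(df["date"]).dt.normalize()
expected = pd.date_range(dates.min(), dates.max(), freq="D")
missing_days = expected.difference(dates.unique())
print(f"{len(missing_days)} calendar days with no records at all")
```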
Often special care needs to be devoted to the handling of missing information, which might be encoded in different ways in different parts of the data. Data that has been collected over a fairly lengthy period may also suffer from changes in coding conventions over time, rendering the overall data set inconsistent. A simple example would be postal codes initially stored as plain 9-digit strings and later stored in the hyphenated ZIP+4 format. Without paying attention to such details, certain aspects of the analysis will suffer.
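A hedged sketch of those two clean-up chores might look like the following; the specific missing-value codes and column names are assumptions for illustration.

```python
import pandas as pd

df = pd.read_csv("mail_history.csv", dtype={"zip": "string"})

# Different eras of the data used different markers for "unknown";
# map them all to a single missing-value indicator.
df = df.replace({"N/A": pd.NA, "UNKNOWN": pd.NA, -999: pd.NA, "": pd.NA})

# Normalize postal codes to one convention (here: the 5-digit prefix only),
# so that 941071234 and 94107-1234 are treated as the same area.
df["zip5"] = df["zip"].str.replace("-", "", regex=False).str[:5]
```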
There is nothing especially glamorous about data preparation and perhaps for this reason there are very few books or courses devoted specifically to this topic. The data cleaning process is crucial for achieving the best possible results, however, and we expect to offer a series of blog entries on this topic in the future.
Stage #6: Modeling
For the professional data analyst, modeling is the most enjoyable and stimulating part of the project. It is here that we get to show off our skills and the magic of modern technology, delivering results with modern analytical tools that were simply unthinkable using only standard statistical methods. In SPM 7.0 we offer CART, MARS, TreeNet, RandomForests, GPS Generalized Pathseeker (lasso-style regression, logistic regression, and the extended elastic net), ISLE model compression, and conventional regression, which gives the modeler a fair range of options to choose from.
Although modelers enjoy building and assessing models, the process of moving from first drafts to final versions can involve a fair bit of experimentation and trial and error. To accelerate this search process we recommend making use of modeling automation, such as that embedded in the SPM BATTERY feature.
By modeling automation we do not mean letting the machine do all the work, but rather letting the machine assist the modeler by automatically running and summarizing a series of experiments in which variations on a theme, tailored by the modeler, are run. This allows the modeler to keep analytical servers running 24 hours a day, building model variations that can be quickly reviewed in graphical summaries.
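This is not SPM's BATTERY feature itself, but the same idea can be sketched with open-source tools: the modeler specifies a small family of variations, and the machine runs and summarizes them all. The synthetic data and parameter values below are assumptions for illustration.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

# Synthetic data standing in for a real modeling table.
X, y = make_classification(n_samples=5_000, n_features=20, random_state=0)

# A "battery" of variations on a theme, defined by the modeler.
variations = [
    {"n_estimators": 200, "learning_rate": 0.10, "max_depth": 2},
    {"n_estimators": 500, "learning_rate": 0.05, "max_depth": 3},
    {"n_estimators": 1000, "learning_rate": 0.02, "max_depth": 4},
]

# Run every variation unattended and collect a summary for later review.
results = []
for params in variations:
    model = GradientBoostingClassifier(random_state=0, **params)
    score = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    results.append((params, score))

for params, score in sorted(results, key=lambda r: r[1], reverse=True):
    print(f"AUC {score:.3f}  {params}")
```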
Stage #7: Evaluation, Interpretation, Understanding
Although any model can be treated as a black-box prediction machine, there are usually valuable insights that can be extracted from the model, and it is worthwhile spending the time to secure them. We might learn, for example, that certain subpopulations of customers are unusually favorable or unfavorable for our objectives, or we might learn that we can shift selling effort from one segment to another. Evaluation, interpretation, and insight are by definition the domain of the thoughtful and informed reviewer, and we do not expect this stage of the project to be automated any time soon (with or without Watson).
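As one hedged illustration of extracting insight rather than treating the model as a black box, permutation importance (shown below with scikit-learn on synthetic data standing in for real customer records) ranks which inputs actually drive the predictions and suggests where to look for interesting subpopulations.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.inspection import permutation_importance

# Synthetic data standing in for real customer records.
X, y = make_classification(n_samples=5_000, n_features=10,
                           n_informative=4, random_state=0)

model = GradientBoostingClassifier(random_state=0).fit(X, y)

# How much does shuffling each input degrade the model? Large drops mark
# the variables that genuinely drive the predictions.
importance = permutation_importance(model, X, y, n_repeats=10, random_state=0)

for rank, idx in enumerate(importance.importances_mean.argsort()[::-1][:5], start=1):
    print(f"{rank}. feature_{idx}: {importance.importances_mean[idx]:.3f}")
```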
The traditional large consulting companies have always understood this very well and have typically devoted the lion’s share of their efforts to telling the story unveiled by the technical analysis in any project. When analytics are undertaken in house the responsible department should take pains to perfect this portion of the project as well.
Stage #8: Full Project
The pilot project may yield results good enough to justify moving forward with the resulting models and insights. In this case, we would move immediately to “Deployment” below and then consider expansion of the pilot.
If the pilot is successful but the business requires the full project before deployment can be considered, then the steps above must be repeated in the context of a much larger scope. While much should have been learned from the pilot, the full project may reveal a host of new problems and challenges, and several conclusions derived from the pilot may need to be revised.
Pilot projects often benefit from the luxury of allowing detailed study of a narrowly defined set of data, supporting very high quality results. A risk in expanding the project is that insufficient resources are allocated, under the mistaken assumption that simply scaling up is routine. Salford Systems has been part of several projects in which, because of such resource constraints, the pilot was of far higher quality than the full project in every aspect of its execution.
Stage #9: Deployment
Unless a model is intended to guide only strategic thinking, its real payoff comes only when it is deployed in some real-world process. Our retail sales prediction models are used to guide the logistics of a grocery store chain, dictating the number of cans of tuna fish to ship to each store in the network next week. Our ad optimization system recommends which of possibly many thousands of ads should be displayed right now to a given visitor of a given web page. Our lifetime customer value model scores credit card applicants on the basis of their likelihood to have nonzero balances on their cards 12 months from now. These models all needed to be embedded in business processes to speed and improve the quality of decisions. Without deployment, models become little more than academic exercises or illustrations of what might be accomplished.
Deployment is a complex topic deserving of an extended discussion all of its own. The only point here is that a modeling system should support easy deployment, by, for example, allowing export of models into reusable code such as Java, C, or PMML. (SPM offers all three and more).
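SPM handles export natively, so the following is only an open-source analogue of the same deployment idea; it assumes the sklearn2pmml package (which requires Java for the converter) is installed, and the model and file name are illustrative.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn2pmml import sklearn2pmml
from sklearn2pmml.pipeline import PMMLPipeline

# Synthetic stand-in for a real response-modeling data set.
X, y = make_classification(n_samples=1_000, n_features=8, random_state=0)

# Wrap the estimator in a PMML-aware pipeline, fit it, and export it as PMML
# so it can be scored inside a Java or database scoring engine.
pipeline = PMMLPipeline([("classifier", LogisticRegression(max_iter=1000))])
pipeline.fit(X, y)

sklearn2pmml(pipeline, "response_model.pmml")
```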
Conclusion
Here are the stages we described once again:
Business Understanding (Specification of an Objective)
Data Inventory and Understanding
Assessment of Data for Suitability
Pilot Project
Prepare (and Explore) Data
Modeling (Build Predictive Models)
Evaluation, Interpretation, Understanding
Full Project
Deployment
We might also add a final stage: Extract lessons learned and refine “best practices” documents to guide future projects.
COMMENTARY: As a consultant, I often discover that my business clients have only a very general and superficial understanding of who their customers are. Many claim to know their customers, but they are seeing only the first layer, or veneer, of their core customers. They lack a deeper understanding, and they are surprised to discover that there are many different layers of customers that make up their core customer base.
It is no longer sufficient to target customers based solely on demographics. Targeting has become far more complicated and difficult than in the past because customers are found both online and offline. They read fewer newspapers for their news, listen to fewer radio broadcasts for their music, watch fewer television programs than in the past, and don't go to movie theaters the way their parents did. We are living in the "digital age," and the internet has changed everything. Customers are spending more time online, consuming more online content, and spending more time sharing and engaging with friends and brands on social networks. Customers are shifting from the desktop to mobile devices like smartphones, tablets, and wearables, adding another layer of complexity to targeting them. Finally, customers must be targeted not only by demographics but also by segment and lifestyle -- in short, by how they spend their time and money.
I cannot stress enough the importance of gaining a deeper understanding and knowledge of your customers in order to target them effectively. Customer data is no longer homogeneous; it originates from many different sources and arrives in different formats. This data must be captured, stored, validated, cleaned, transformed, and aggregated, and then processed through a business intelligence system for data mining, online analytical processing (OLAP), data visualization, reporting, and management dashboards.
Data Sources and Data Flows Essential for an Efficient Business Intelligence System (schematic)
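The schematic above maps the full pipeline; as a compressed, hedged sketch of the same capture, clean, transform, and aggregate flow (with invented file and column names, and pandas standing in for a full BI stack), the steps might look like this:

```python
import pandas as pd

# Capture: pull raw records from two hypothetical source systems.
pos = pd.read_csv("pos_transactions.csv", parse_dates=["visit_date"])
web = pd.read_csv("web_signups.csv", parse_dates=["signup_date"])

# Validate / clean: drop obviously bad rows, standardize the customer key.
pos = pos[pos["amount"] > 0]
pos["customer_id"] = pos["customer_id"].str.strip().str.upper()
web["customer_id"] = web["customer_id"].str.strip().str.upper()

# Transform / aggregate: one row per customer, combining both sources,
# of the kind a dashboard or OLAP tool would consume.
summary = (
    pos.groupby("customer_id")
       .agg(visits=("visit_date", "count"), revenue=("amount", "sum"))
       .join(web.set_index("customer_id")[["signup_date"]], how="left")
)
summary.to_csv("customer_summary.csv")
```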
As the former CFO of Solar Planet, a very successful indoor tanning salon franchise headquartered in San Francisco, I spearheaded the effort to gain a deeper understanding and knowledge of the customers of our tanning salon franchisees located across the U.S. This required establishing a data warehouse, then going through numerous steps (see the schematic above) to get our customer data into a format that could be used for business intelligence analysis and reporting.
When we began this effort, we had a preconceived notion of what our tanning salon customers were like, but when we analyzed the customer data through our business intelligence system, we discovered that our customers were not all the same. True, they all enjoyed tanning, but for different reasons, and they were as different as night and day. This deeper understanding allowed us to target them more accurately while using the most efficient media channels to reach them. Armed with this information, we were able to provide our existing Solar Planet franchisees with better service and marketing advice, and to identify the best locations for future Solar Planet franchises. Solar Planet was able to customize the advertising and promotion campaigns for each franchisee based on the knowledge we gained about their customers. Our franchisees saw measurable improvements in new customers, increases in revenue, and lower marketing expenses, and the results of each marketing campaign became much more predictable. Below is a presentation that I prepared for our management team and potential investors. I hope you find it valuable.