Is There A Place for AI in Small to Medium Businesses?

Many small to medium business owners view artificial intelligence as something only huge corporations need.

In reality, it can help position them to compete with those corporations on a whole new level.

It seems like everyone in the business world is launching artificial intelligence programs.

That’s partly because nearly everyone is. 61% of businesses have already begun using some form of artificial intelligence, many of those focusing on predictive analytics and machine learning.

71% report they plan to expand their use of predictive analytics and other AI applications over the next year.

For most companies the decision to adopt AI is an easy one.

For small to medium businesses (SMBs), though, there are tough questions to answer.

Even successful SMBs don’t have the same depth of financial resources as a multinational corporation.

They need to invest cautiously, and artificial intelligence can sound like a science fiction daydream.

That’s unfortunate, because artificial intelligence is fast becoming the kind of tool that can help small to medium businesses keep up with their larger competitors.

Read on to explore what keeps SMBs from investing in artificial intelligence, then find out how to get past those barriers and which technologies are best suited for small to medium businesses.

Practical Artificial Intelligence

“Artificial Intelligence” brings to mind futuristic robots and complex movie plots, but the reality is much simpler.

The term refers to teaching machines to “think” and interpret information like humans do. Humans have very flexible minds.

They can handle a variety of rapidly-changing topics and navigate difficult conditions that confuse computers (although computers have a greater ability to process repetitive data quickly and accurately).

Modern artificial intelligence has come a long way.

It can’t quite mimic human thought yet, but there have been some exciting advances using AI techniques like machine learning and deep learning that show potential for nuanced processing.

The technology is proving its value as an enterprise tool, too.

There are a few common applications that some people don’t realize are based on artificial intelligence:

  • Predictive analytics, especially embedded features in enterprise software
  • Chatbots on websites or social media pages
  • Intelligent assistants in office and productivity software
  • Recommendation engines used for suggesting Netflix titles and upselling in ecommerce
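The mechanics behind these features can be surprisingly simple. As an illustrative sketch (the purchase data and the `recommend` helper below are made up for the example), a basic recommendation engine can be built from nothing more than counts of items bought together:

```python
from collections import Counter
from itertools import combinations

# Hypothetical purchase histories; in practice these come from order data.
orders = [
    {"notebook", "pen", "ink"},
    {"notebook", "pen"},
    {"notebook", "stapler"},
    {"pen", "ink"},
]

# Count how often each pair of items appears in the same order.
co_counts = Counter()
for order in orders:
    for a, b in combinations(sorted(order), 2):
        co_counts[(a, b)] += 1
        co_counts[(b, a)] += 1

def recommend(item, k=2):
    """Suggest the k items most often bought alongside `item`."""
    scored = [(other, n) for (i, other), n in co_counts.items() if i == item]
    return [other for other, _ in sorted(scored, key=lambda x: -x[1])[:k]]

# Most frequently co-purchased items come first.
print(recommend("notebook"))
```

Production recommenders add normalization and scale considerations, but the core idea – suggest items that co-occur with what a customer already has – is the same.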

What Holds SMBs Back

Even as larger companies move to wider integrations of artificial intelligence, small to medium businesses are slow to adopt.

Their hesitation is understandable – after all, a failed technology project could threaten the future of their company – but it also holds them back.

The truth is, many of their concerns aren’t as serious as they think.

The issues have practical workarounds or can otherwise be mitigated through proper planning.

Here’s why the leading reasons SMBs aren’t adopting artificial intelligence don’t have to be immovable roadblocks, and how each one can be overcome.

AI is too expensive

Industry news reports tend to cover high-end artificial intelligence ventures done by major international corporations, with price tags in the millions (or occasionally billions).

That kind of investment is an intimidating prospect for an SMB that just needs a better way to utilize its data.

The thing is, those programs usually involve the most difficult and expensive forms of AI.

Experimental programming, complex interactions, sensitive health information, government-regulated data, large numbers of simultaneous users, and other complicating factors raise the costs above the average for enterprise AI projects.

SMBs don’t need the same amount of scale or infrastructure. Their modest needs can be met at a much more reasonable price point.

There is no “usual” price for AI. The costs associated with artificial intelligence are based on many factors, including safety and regulatory protocols and the complexity of necessary interactions.

To build an estimate, developers will ask questions such as:

  • Does the program need access to sensitive information?
  • Is it designed to address a specific set of circumstances or is it more a broad-spectrum tool?
  • What level of interaction with humans is desired?
  • What’s the scale involved?
  • Will the AI need to perform complex actions?

Even when a full artificial intelligence program is out of reach, there are ways to integrate AI on a limited budget.

For one thing, AI is included in many enterprise software packages. Most companies already have access to some AI tools, even if they don’t realize it.

Targeting tools in email marketing software and personal assistants on smartphones are both driven by artificial intelligence.

More in-depth AI toolsets are often available with a reasonably priced upgrade from free or lower-tier accounts to the enterprise level.

It’s worth checking with vendors to see what’s within reach.

The rise of reusable code and powerful development frameworks has put small-scale custom solutions within reach, as well.

Developers have platform options for creating analytics dashboards and chatbots that make the costs approachable for SMBs.

AI isn’t ready for enterprise because the projects fail too often

Project failure is a daunting prospect for SMBs, who usually have a longer list of desired business improvements than they have capital to spend.

They need to prioritize projects because they can’t do everything they’d like.

Investing in AI means putting another project on hold, something they aren’t willing to do when it seems like all they hear about is failed artificial intelligence projects.

It’s easy to become discouraged by high-profile AI failures or assume tools are overhyped, because some projects do fail and some tools are overhyped.

Artificial intelligence is at a point in the Hype Cycle where its applications are being rigorously tested, and some won’t make it through to becoming everyday technologies.

However, project failure is more often an organizational issue than a technological one.

Projects fail for a variety of reasons, most commonly:

  • A weak discovery process results in a weak final product.
  • Internal adoption rates are too low to realize the project’s potential.
  • Misaligned business goals lead to the company creating a product that no longer fits within their workflows.
  • The company experiences an outsourcing failure or developer issues.

Avoiding these issues is one area where small to medium businesses may have an edge over larger corporations. Why?

  • Pushing internal adoption on a small team is more effective because the company leadership can personally talk to everyone (or at least every team leader) to convince them of a project’s value.
  • There is less opportunity for confusion over business needs and goals.
  • The development process has fewer moving parts, so it’s easier to make needs clear during discovery.

What SMBs need to watch out for is the tendency to default to the lowest bidder, especially when outsourcing overseas.

If they focus on quality as much as price, they’re more likely to get a quality return on their investment.

Choosing Agile development methods is another way to ensure a positive outcome.

Developers who use Agile and conduct a thorough discovery are actually seeing a rise in project success rates, and have been for a couple of years.

AI isn’t practical for a small to medium business; it only works for massive corporations

Many SMB owners see AI as something that can’t help their business.

They assume they don’t have enough data to process or that the impact of AI won’t be noticeable at a smaller scale.

A lot of those same owners would be surprised to realize how much data they already have – data which is going untapped.

Putting that data to work might result in smaller gains, but proportionately those gains matter more.

One interesting thing about AI is that it has opposite benefits for SMBs and larger companies.

It helps giant companies operate with the personalization of an SMB while allowing SMBs to function with the efficiency of a massive corporation.

That is, it gives small to medium businesses the edge they need to “punch above their weight class” when competing for market share.

While there are some AI applications that won’t help smaller-scale businesses, there are many more that will.

A small bookshop with five employees wouldn’t get value from predictive scheduling software, but it could see an impressive return on predictive ordering and email marketing programs.

AI doesn’t apply to this industry

There’s a perception that artificial intelligence is only for high-tech fields like software development or banking.

That couldn’t be further from the truth. AI can be applied anywhere data is generated – that is, everywhere – to improve efficiency, guide decision making, and maximize the impact of marketing and sales campaigns.

Some examples:

  • A cleaning company uses AI to intelligently manage their leads and upsell current clients.
  • A stroller rental company builds an AI-powered solution to manage their inventory and give customers more options for customizing deliveries.
  • A vacation rental agency uses price optimization to get the best possible pricing on rentals for owners.
  • A landscaping company decides where to expand based on data gathered from predictive analytics tools.

These are all small but important decisions, and they’re made easier using insights gathered by artificial intelligence.

AI is too hard to learn

SMBs tend to have long-time employees in leadership positions with lower turnover in mid-level roles.

Because of those established relationships, leaders often hesitate to push anything that seems high-tech or confusing onto their employees.

These fears are largely unfounded. Building enterprise AI tools is complicated.

Using them is less so, especially with custom tools created specifically for non-technicians.

Most enterprise AI software is designed to be user-friendly at an operator level, so the on-boarding process would likely be much less complicated than SMBs might expect.

Where there are problems, there are well-established training solutions.

The most popular AI tools have online classes at a variety of price points, from free YouTube tutorials to subscription-based professional development platforms.

Developers generally offer training and support packages for their software at reasonable rates.

With so many options even the most technophobic staffer can find a way to get on board with new tools, especially once they realize how much easier AI makes their job.

Staying In The Game With AI

Larger companies are already investing in artificial intelligence.

As they do, they’re gaining a lot of advantages traditionally enjoyed by SMBs, like personalized service and shorter response times to changing local market conditions.

Small to medium businesses have a choice. They can make the AI investment that will help them stay competitive or risk losing their customer base to larger, better-informed companies.

At the end of the day, that isn’t much of a choice at all.

Artificial intelligence doesn’t have to be a headache. Concepta can help you build an intelligent business intelligence solution that fits your needs – and your budget. Schedule your complimentary appointment today!

Request a Consultation

Predicting Profit: Growing Revenue by Scoring Opportunities with Data Intelligence


Of all the artificial intelligence applications making their way into enterprise, one of the most effective is predictive analytics.

It has the potential to transform the decision-making process from something based on “gut instincts” and historical trends to a data-driven system that reflects actual current events.

Those who have adopted it are outperforming their competitors at every turn.

Some organizations still hesitate to launch their own predictive initiatives. Artificial intelligence has been a buzzword for decades, and many times in the past it failed to deliver on its promises.

Executives with an eye on the bottom line might avoid any kind of AI technology out of understandable skepticism.

That hesitation could be holding these companies back. Predictive analytics has matured into a viable enterprise tool. It’s being used all over the world to find opportunities for growth.

The impact is so striking that 71% of businesses are increasing their use of enterprise analytics over the next three years.

Read on to learn more about predictive analytics, what it offers for enterprise, and how it can drive a measurable increase in revenue.

What Does “Predictive Analytics” Mean?

Predictive analytics is the practice of analyzing past and present data to make predictions about future events.

Technically it can describe any process that seeks to identify the most likely scenario by drawing parallels between historical and current conditions, then placing those conclusions in a modern context.

Analysts look at what happened in the past when conditions were similar and how certain events played out. They assign more weight to factors which have tended to be more influential or which have greater potential for an extreme outcome.
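To make the weighting idea concrete, here is a minimal sketch of the kind of calculation an analyst might automate (the factor names and data are hypothetical): weight each factor by how often the outcome followed it historically, then score today's conditions against those weights:

```python
# Hypothetical historical records: which conditions held, and whether the
# outcome of interest (say, a sales spike) followed.
history = [
    ({"holiday", "sunny"}, True),
    ({"holiday"}, True),
    ({"sunny"}, False),
    ({"rainy"}, False),
]

def factor_weights(records):
    """Weight each factor by how often the outcome followed when it held."""
    factors = {f for conditions, _ in records for f in conditions}
    weights = {}
    for factor in factors:
        outcomes = [o for conditions, o in records if factor in conditions]
        weights[factor] = sum(outcomes) / len(outcomes)
    return weights

def predict(current_conditions, weights):
    """Average the weights of whichever known factors hold right now."""
    relevant = [weights[f] for f in current_conditions if f in weights]
    return sum(relevant) / len(relevant) if relevant else 0.0

weights = factor_weights(history)
print(predict({"holiday", "sunny"}, weights))  # holidays outweigh the weather
```

Real models use far more variables and more sophisticated statistics, but the principle – let historical outcomes set the weights – is the heart of predictive analytics.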

Most people assume predictive analytics is a recent invention, something that arose after computers established their role in enterprise.

In reality, businesses have been relying on it since the late 1600s, when Lloyd’s of London began applying the practice to their shipping insurance estimates.

In pre-artificial intelligence days, people used statistical modelling and complex mathematical equations to perform predictive analytics.

Many updated versions of those models are still in use for industrial shipping and route planning.

Non-intelligent methods have limitations, though. They only consider factors users decide to include, so there’s a heavy likelihood of human error and bias.

It also takes time to perform the calculations. By the time they’re done and put into use the data is already becoming outdated.

The modern style of predictive analytics incorporates artificial intelligence and machine learning. It allows users to include many more variables for a broader, more comprehensive analysis.

The process highlights unexpected connections between data and weighs factors with greater accuracy. All of this can be done with a short enough timespan to create timely, reliable insights.

The Intersection of Science and Enterprise

AI and machine learning have exciting potential, but like all emerging technology they require investment. Global brands like Google might have the resources to shrug off a failed project.

For the majority of organizations, though, there has to be a reasonable expectation of profit to consider launching a technology initiative.

Successful enterprise leaders aren’t reckless with their IT budgets. They focus on actual business goals and source technology that addresses those instead of trying out popular new tools.

A smart executive’s first question about a new tool should be, “How does this benefit the company?”

Predictive analytics has an answer for that question: growing revenue through refined opportunity scoring.

The Value of Opportunity Scoring

Opportunity scoring involves assigning a value to a sales lead, project, goal, or other potential course of action in order to determine how much relative effort to spend in pursuit of that opportunity.

Scoring opportunities allows a company to get a greater return on their time and money. No company can put the same investment into every customer or chance that crosses their path.

They shouldn’t even try. 80% of revenue comes from 20% of clients, so it makes sense to prioritize that 20%.

It’s something every business does, even if there isn’t a standardized process in place. High value sales leads are called frequently, given more flexibility with concessions, and assigned better sales representatives.

High value projects get bigger teams, more access to resources, larger budgets, and scheduling priority.

The trick is deciding which opportunities have the most potential – and that’s where predictive analytics comes into play.

Manual Versus Machine Learning

Two main types of opportunity scoring methods are in widespread use today: those that are done manually and those that take advantage of machine learning.

Manual scoring is where people assign scores, either personally or using a statistical model, based on their own set of influential characteristics.

There are a number of problems with this method that can slow businesses down or leave them with unreliable scores.


Opportunity scoring is incredibly valuable – when it’s reliable. Manual scores have a wide margin of error. They depend on direct user input with no ability to suggest other relevant factors.

Unlike intelligent scoring, manual methods can’t easily be used to find unexpected commonalities among high-return accounts.

The problem with this approach is that executives can’t realistically imagine or keep track of everything that might influence their company.

There’s too much data to consider, and it changes constantly. All manual opportunity scoring is therefore based on aging, less useful data. On top of that, it’s easy to make mathematical mistakes even with a computer program’s assistance.


Because users choose and weigh the contributing factors, manual scoring is highly susceptible to human bias.

Preconceptions about social categories, personal characteristics, industrial domains, and other identifying factors can be given too much (or too little) weight. It allows for unhelpful biases to be introduced into the sales cycle.

Most of the time the result is simply less helpful scores, but it does occasionally create a public relations issue if the scoring system leads to operational discrimination.

For example, some real estate brokers have run into problems where members of one racial group weren’t shown houses in a certain area. The company’s scoring suggested those groups were less likely to buy there.

The realtors thought they were following good business practices by relying on their internal customer scoring, but those scores were skewed by biases about economic stability rather than actual data.

When the situation came to light it created a public impression that the realtors had the same bias. Suddenly they had bigger worries than opportunity scoring.


Creating and maintaining a manual scoring process takes a significant chunk of time. It’s not a system that responds well or quickly to change.

That’s a problem in today’s hyper-connected world, where an event in the morning could start influencing sales across the country that afternoon. Opportunities wind up passing before they’re even recognized.

Not everyone remembers to consider the “hidden costs” of ineffective processes, like higher labor costs.

Tedious data entry takes time and focus away from more productive sales activities without a corresponding return on value due to the lower accuracy.

There’s human nature to consider, as well. Sales teams can tell these scoring systems aren’t effective.

It’s not uncommon for teams to resist wasting time they could spend working leads. They don’t like putting their commissions at risk with bad information, but they still need some kind of guidance to help manage their lead interactions.

More often than not, they operate with outdated scores or “gut instincts”. That in turn frustrates managers who have invested in the inefficient manual scoring process.

The conflicting pressures create an uncomfortable working environment that drives unwanted turnover among the most valuable sales agents.


Even the most sophisticated manual scoring program can only account for things that have been specifically input into the equations.

They require humans to think of and assign value to every possible factor. This tends to enforce a “business as usual” mindset over more profitable responsive operations.

On the flip side, predictive opportunity scoring is among the leading AI-based drivers of revenue in an enterprise context. It has the edge over other methods in several areas.


There are two central reasons behind the higher reliability of intelligent scoring. First and foremost, it reduces the impact of human error.

Machine learning algorithms perform calculations the same way every time. Even as they adjust themselves in response to new data, the underlying math is more reliable than calculations done by humans.

It’s also important that predictive analytics is purely data-driven rather than focusing on “traditional knowledge” about what makes an opportunity valuable.

Artificial intelligence expands a computer’s capacity to judge the relevance of seemingly unconnected events.

Predictive analytics leverages that capacity to identify characteristics shared by highly productive courses of action. Those commonalities are then used to weigh future opportunities with a greater degree of accuracy.
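One way to surface those shared characteristics is a simple “lift” calculation: how over-represented an attribute is among winning deals compared to deals overall. The sketch below uses hypothetical deal data and an illustrative `attribute_lift` helper; real scoring systems use richer models, but the principle is the same:

```python
from collections import Counter

# Hypothetical closed deals: simple attributes plus the eventual outcome.
deals = [
    ({"industry": "retail", "source": "referral"}, "won"),
    ({"industry": "retail", "source": "cold-call"}, "lost"),
    ({"industry": "finance", "source": "referral"}, "won"),
    ({"industry": "finance", "source": "cold-call"}, "lost"),
]

def attribute_lift(records):
    """Compare how common each attribute value is among won deals vs overall."""
    overall, won = Counter(), Counter()
    wins = 0
    for attrs, outcome in records:
        for pair in attrs.items():
            overall[pair] += 1
            if outcome == "won":
                won[pair] += 1
        wins += outcome == "won"
    # Lift above 1.0 marks a trait over-represented among winning deals.
    return {pair: (won[pair] / wins) / (overall[pair] / len(records))
            for pair in overall}

lift = attribute_lift(deals)
print(lift[("source", "referral")])  # referrals are strongly tied to wins here
```

In this toy data the referral source carries all the signal while industry carries none, which is exactly the kind of unexpected commonality an intelligent scoring tool is built to find.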


Predictive analytics fed by a constant stream of new information allow predictive opportunity scoring tools to update scores in real time (or at least near-real time).

They highlight opportunities in time for companies to take action. Advance warning also leads to better advance planning when it comes to inventory, staffing, and marketing.


Intelligent scoring tools react to actual circumstances based on data instead of presumptions. They consider all data available for a situation from an impartial standpoint (providing, of course, that any developer bias is accounted for during design).


The majority of enterprise analytics software is designed to be user-friendly and easy to operate. It connects to a company’s data sources, meaning there’s usually very little data entry required. This frees the sales team to focus on their accounts and other high value activities.

Optimized Opportunity Scoring = More Revenue

Predictive opportunity scoring is backed by sound theory. The practical benefits are just as solid, with several applications proving their enterprise value today:

Lead Scoring

There’s a saying that 80% of revenue comes from 20% of clients. Lead scoring helps identify that 20% so sales teams can focus on the most valuable customers.

Agents close more contracts – and higher-value ones – when they know where to spend their time profitably rather than chasing a weak prospect for a low bid.

Intelligent lead scoring is a dynamic process, meaning it regularly reevaluates leads based on changing circumstances.

A client marked as a low priority lead moves up as indicators show rising interest. Agents then step in at the right point in the purchase cycle with a targeted incentive to encourage a sale (or even an upsell).

Customized attention has the added benefit of increasing repeat business, because former clients aren’t bombarded with high-pressure closing tactics when they aren’t ready to buy again.

Enterprise benefit: Better closing ratios, greater customer lifetime value, higher average order value

Targeted marketing campaigns

When executives know where advertising has the most potential impact, they have more options for strategic spending.

Specific demographics can be targeted with tailored campaigns to increase their lifetime value, keep them from falling out of the sales cycle, or meet another specific business goal.

Enterprise benefit: Higher ROI on advertising campaigns

Inventory management

Information can travel around the world near-instantly, but inventory still has to be physically moved.

That puts a natural drag on rearranging supply levels in different regions – and with it, a cap on how much companies can exploit regional fluctuations in demand.

This causes lower sales in general, but the effect is most striking during an unexpected surge in demand.

Predictive analytics can spot early indicators of a potential spike while excitement is still building.

Executives have the warning necessary to shift goods and team members where they’re needed most. Amazon uses this method to stage their merchandise for faster shipping of commonly ordered items.

Enterprise benefit: Higher overall sales revenue

Opportunities for growth

Opportunities aren’t always about marketing campaigns or scoring leads. Sometimes executives want guidance when choosing between growth strategies.

Predictive analytics and opportunity scoring are useful here as well, answering questions like:

    • Which stores will be most valuable over the next year?
    • Where should the company expand?
    • Should more franchises be sold in a specific area?
    • What ventures are most likely to succeed?
    • Is a specific project worth the investment?

There’s no guarantee that any course of action is the best, but incorporating data cuts out many of the risk factors that lead to failure (such as hype over new technology or personal bias).

Enterprise benefit: Faster, more sustainable growth

Putting Data to Work

At the end of the day, data is only valuable when it serves real-world business goals.

Opportunity scoring is one of the most proven ways to extract value from data. It’s also one of the most accessible, since embedded analytics are built into the majority of modern enterprise software.

With so much to gain at a relatively low investment point, those who haven’t adopted yet should be giving predictive analytics a closer look.

Are you frustrated by trying to navigate multiple streams of data? One of the most common pain points of data intelligence initiatives is reconciling data from the enterprise software used by different departments. Concepta can unite data from programs like Salesforce, MailChimp, and other analytics tools to put data where it’s needed most – in your hands!

Request a Consultation

Which Business Analytics Trends Can Be Put To Use Today?


Originally published April 6, 2017, updated Dec. 18, 2018.

The BI technologies that offer the best chance of success today are those that allow companies to take advantage of time-sensitive opportunities while providing more responsive customer service.

One of the most important parts of developing a digital strategy is knowing when not to jump on a high-tech bandwagon.

Some technologies show potential in small-scale trials but haven’t had enough real-world usage to prove their worth to enterprise. Adopting too early puts companies at risk of losing their investment.

On the other hand, waiting too long leaves them in their competition’s shadow.

There’s a lot at stake in this balancing act. Digital transformation isn’t a luxury anymore. It’s critical for companies who want to stay competitive.

Even chains of three or four locations can fall behind their peers if they aren’t maximizing their data usage.

Of course, building up digital infrastructure costs money. Choosing the right technology is the best way to ensure a smooth return on that investment.

A number of business analytics trends are already picking up speed coming into 2019.

Some are years away from being able to deliver on their promises. Others have reached the stage where a company can reliably use them to gain a competitive edge while side-stepping the risks inherent to early adoption.

The best of this second group are outlined below. These are the trends to adopt for enterprises seeking to improve their data agility.

Predictive analytics

Predictive analytics as a field has existed since the late 1600s, when Lloyd’s of London used it to estimate insurance rates on seagoing vessels.

Until the rise of computers, though, it wasn’t a practical means of steering business.

There were too many variables for a human to consider in time to form more than broad predictions.

Widely available cloud storage and increased processing power changed that.

The field has seen a resurgence as the most efficient way to maximize data usage and feed a data-driven decision making process.

73% of companies consider themselves to be analytically driven, and predictive analytics are behind the most successful of these.

Predictive analytics detect deviations in patterns, generate insights based on evolving activity, and predict future outcomes from gathered data.

The benefits of predictive analytics are clearly demonstrated by the variety of practical applications in use today. One unexpected example is human resources.

Retaining experienced workers is a constant challenge for employers who must cope with turnover rates of nearly 20% (averaged across US industries).

The tech sector suffers from even higher turnover. Replacing lost workers can cost as much as half their annual salary, not counting lost productivity during the training process.

Using predictive analytics, HR managers can find patterns in their employment data that highlight the reasons good employees leave and suggest the incentives most likely to make them stay: higher salaries, additional training, more appealing benefits packages, or in some cases transfers to more engaging positions.

The data also predicts which employees are most desirable to hire and retain.

There’s still a long way to go before the full potential of predictive analytics is realized. That said, the technology is maturing much faster than experts predicted.

Its current capabilities are more than reliable enough to justify making an investment.

Real time analytics

Real time analytics (also known as streaming analytics) give enterprises an up-to-date visualization of their operations.

It was a growing trend back in 2017, and today it’s living up to that promise.

In the traditional analytics model, information is stored in a data warehouse before analysis is applied.

This causes a gap between collection and results where perishable opportunities are lost.

There’s no rule that says data has to be stored first. It can be analyzed mid-stream to sift out data that will only stay relevant for a short time.

Companies then have the chance to make the most of the opportunity through swift action.
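A sketch of that mid-stream approach (the numbers and the `spot_spikes` helper are made up for illustration): each event is checked against a rolling average as it arrives, so a spike can be flagged before anything reaches the warehouse:

```python
from collections import deque

def spot_spikes(events, window=5, threshold=1.5):
    """Flag values as they arrive, before anything is written to storage.

    An event is flagged when it exceeds `threshold` times the rolling
    average of the previous `window` events.
    """
    recent = deque(maxlen=window)
    for value in events:
        if recent and value > threshold * (sum(recent) / len(recent)):
            yield value  # a perishable opportunity, surfaced immediately
        recent.append(value)

# Hypothetical stream of hourly order counts.
stream = [10, 11, 9, 10, 30, 12, 11]
print(list(spot_spikes(stream)))  # flags the surge to 30 as it happens
```

Streaming platforms do this at vastly greater scale, but the design choice is the same: analyze each record on arrival instead of waiting for a batch job over stored data.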

Information gathered by real-time analytics is usually displayed in a dynamic graphic format that doesn’t require a data science degree to understand, too. That makes it easy to act on quickly.

A business that can spot opportunities in time to take action makes much greater use of its data than those left playing catch-up on trends.

The one caveat about streaming analytics is that they work best in data-driven cultures. Be sure to provide both technical training and executive support when launching a real-time analytics tool.

Chatbots and Natural Language Processing

Natural Language Processing (NLP) has grown from an internet novelty to a reasonably robust tool.

While it hasn’t seen as much use in the corporate world as its cousin, Natural Language Generation (NLG), it has developed enough for enterprise use.

The most relevant NLP application right now is employing chatbots to provide 24/7 customer support availability. Customers can interact with a chatbot using normal, everyday language.

The sophistication of a bot varies widely. Some have very basic account support capabilities; others can guide a customer from selecting a product all the way through checkout.

At this stage of maturity users generally know they’re speaking to a chatbot, though NLP has evolved to the point where the bot doesn’t frustrate users by getting stuck or spitting out garbled answers.

Instead, bots provide a clear, straightforward path to resolving common customer issues. The convenience of having uninterrupted access to routine account services tends to negate any annoyance.

Virtual assistants also fall under the heading of NLP. These let users request analytics and services using natural language and receive replies either out loud or projected to a specific device.

There are virtual assistant integrations for a huge variety of popular enterprise programs. Some even provide a path for the assistants to complete purchases using pre-approved sites.

Looking forward

Some interesting trends are gaining traction right now. Connected “multi-cloud” strategies are maturing, and research firms like Gartner have been tracking the application of real-time analytics to automated insight generation.

For now, though, the analytics trends are the ones which have demonstrated their utility and staying power.

While nothing is guaranteed in the tech world, even tech-averse companies can expect at least a reasonable ROI from their adoption.

Are you interested in adding to your business analytics toolkit? Get a full assessment of your BI needs!

Request a Consultation

2018 Technology Predictions for Enterprise


Staying on top in the tech world means constantly looking ahead.

It’s easier, cheaper, and more effective to integrate new technologies while building applications instead of trying to work them in later.

To that end, here’s a look at where enterprise technology is headed in 2018.

Narrow Artificial Intelligence shows ROI

Scientists have been enthusiastic about artificial intelligence for years, but excitement is spreading to the business community as it demonstrates real world value.

Artificial intelligence can generally be defined as either “general” or “narrow.”

General AI describes machines that can reason and make decisions like a human, without a set domain or task to focus on. It can shift from one task to another as easily as humans.

General AI is the “Holy Grail” for computer scientists, but no one has managed to achieve it yet.

It doesn’t help that every time technology reaches what was once considered general AI, scientists realize the definition wasn’t comprehensive enough.

That task is then relegated to the narrow AI category.

That doesn’t make narrow AI useless, however.

Narrow AI applications are very competent within a single domain or when performing a set task.

That may sound limiting, but in reality it’s the only kind of AI showing ROI.

Every AI currently seeing regular enterprise use is narrow AI.

Computer vision and natural language processing (NLP) are also considered narrow AI in their present forms, though they will likely be folded into general AI eventually.

The use of narrow AI will expand dramatically in 2018.

Major companies are investing in AI, like Ford’s $1 billion Argo AI project aimed at developing self-driving car brains.

Cloud computing will push AI into the mobile realm, too.

79% of tech leaders say increasing AI usage is a priority, and 40% of digital transformation initiatives this year will include AI.

Voice and Visual Search become dominant

Technically these are both forms of narrow AI, but they’ve seen such a surge in interest that they deserve special notice.

Visual search uses an image as the basis for a query instead of text. Users can search for other versions of the same picture, similar pictures, or general information about the image.

As computer vision improves visual search is becoming more viable. While it once only let users find the origin of images, it now offers more options:

  • Find photos of friends across the internet
  • Pull up order pages for items worn by celebrities
  • Get recipes for new favorite dishes
  • Research vacation locations

Visual search is already seeing expanded use by major companies.

Google is fine-tuning its new Google Lens, which lets users take a picture with their cell phone and execute a search based on objects within the frame.

Pinterest has a Lens Your Look feature where pinners can take photos of their clothing to get style suggestions.

They’re also upgrading their image search to a Responsive Visual Search, where users have the option to specify what interests them within a photo.

Voice Search allows for search based on spoken commands. In the past voice search was so unreliable it inspired jokes on late night TV, but it’s improved a great deal.

In 2012 the average error rate for voice search was 20%. This year that number fell to 8%.

As the error rate has dropped, people are finding it much easier and more reliable to use. 87% of consumers think voice search is accurate enough to use.

21% of mobile users activate their voice search daily, with half using it at least once a week.

The main draw seems to be the ability to safely use smartphones while driving: 53% of those who use voice search regularly do so behind the wheel.

Because of the rising appeal of these features, expect heavy investment in computer vision and natural language processing throughout 2018.


Blockchain edges into more common enterprise usage

Blockchain is a secure form of distributed digital ledger consisting of a “chain” of individually encrypted “blocks.”

It’s managed via a Peer to Peer (P2P) network; everyone has a copy of the chain, which makes altering or forging blocks practically impossible.

Blockchain is famously used for cryptocurrencies like Bitcoin or Ethereum.

There are more uses for blockchain than tracking cryptocurrencies, though.

Storing information in a blockchain is like storing it in a shared database which is being constantly reconciled.

Records are public and easily verifiable.

Once a transaction is recorded it’s incorporated into the chain and can’t easily be erased.

Blockchain also exists on a distributed network, so it can’t be censored or altered by any single party.
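The “chain of individually encrypted blocks” structure can be illustrated with a toy hash chain. This is a sketch of the linking mechanism only (no network, no consensus): each block stores the hash of the previous block, so tampering with any earlier block invalidates every hash after it.

```python
import hashlib
import json

def make_block(data, prev_hash):
    """Each block records its data plus the hash of the previous block,
    so altering any earlier block breaks every link after it."""
    body = {"data": data, "prev_hash": prev_hash}
    block = dict(body)
    block["hash"] = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()
    ).hexdigest()
    return block

def verify(chain):
    """Re-derive each block's hash and check the links between blocks."""
    for i, block in enumerate(chain):
        body = {"data": block["data"], "prev_hash": block["prev_hash"]}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if block["hash"] != expected:
            return False  # block contents were tampered with
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False  # link to the previous block is broken
    return True

genesis = make_block("genesis", "0" * 64)
chain = [genesis, make_block("Alice pays Bob 5", genesis["hash"])]
```

In a real blockchain the same verification runs independently on every peer in the P2P network, which is what makes forging a block across all copies so hard.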

The secure nature of blockchain ledgers has many enterprise applications:

  • Smart contracts: Digital agreements can be structured with blockchain to be fulfilled automatically when specific conditions are met. AIG is testing smart contracts for international insurance policies.
  • Preventing voter fraud: Each vote can be accounted for securely and transparently. This holds enormous appeal for civil rights groups looking for ways to prevent rigged elections.
  • Secure resumes: Create an unalterable CV that lets HR managers trust international hires. Secure resumes are especially helpful for jobs with security concerns.
  • International payment systems: Blockchain provides a method for making payments around the world without worrying about embezzlement or funds being intercepted.
  • Transparent supply chains: Secure tracking eliminates confusion and highlights trouble points in the supply process. It could improve global trade by instituting a level of accountability and trust between trade partners.
  • Secure health records: Hospitals can provide better care and cut down on waste and abuse with an unalterable medical record.

Blockchain technologies are growing in number and sophistication.

More than 2500 new blockchain patents have been filed over the last year.

Analysts predict that blockchain will disrupt the insurance and banking industries in particular.

Augmented Reality continues to mature

Augmented reality involves laying computer-generated or animated assets over a camera image. It’s already seen popular use in entertainment:

  • Pokémon Go
  • Snapchat filters
  • Star Wars Find The Force mobile app

This year, technological advances are pushing augmented reality beyond gaming. AR will find a home in other industries.

  • Retail: RayBan and tattoo shop Inkhunter both have “try on” apps that utilize AR.
  • Education: AR maps constellations over a live camera image of the sky or highlights geological features in real time.
  • Travel: AR gives translation apps a boost by overlaying translations on signs or menus.

Investment in augmented reality is rising.

Spending will nearly double in 2018, going from $9.1 billion to $17.8 billion by the end of the year.

Commercial use will make up 60% of that spending.

Some analysts suggest that 85% of all AR investments will be commercial (as opposed to games) by 2020.

Google, Apple and others are planning AR glasses, but mobile devices are the priority for 2018.

They offer a wider audience with lower investment costs.

Competition intensifies for technical talent

As companies push digital transformation, the talent gap will be more keenly felt.

There are 3 million more STEM positions than available workers right now.

40% of companies looking for tech talent aren’t in the tech industry; they’re forward-thinking companies in other fields looking for help with digital transformation.

The fields in highest demand are data science, business intelligence, digital security, and cloud computing.

There’s also a call for experienced project managers.

65% of tech leaders say that hiring shortages are holding their digital development back. That’s up from 59% last year.

How are they getting around the skills gap? 60% use third parties to support organizational bandwidth. 55% are outsourcing some or all of their analytics needs.

Expect fierce competition for experienced tech talent in 2018.

Nurturing loyalty in employees with tech skills and offering attractive retention packages will become critical to prevent losing staff to headhunters.

Predictive analytics will drive a “quantifiable” model of business processes as Big Data investments rise

Global enterprise investments in data and analytics will surpass $200 billion a year by 2020.

Companies will be looking for ways to quantify business functions where possible in order to get the best possible ROI from these investments.

There is a proven advantage to pushing predictive analytics.

48.4% of companies reported measurable results from Big Data investments.

This year, they want to spread that success throughout their organizations by setting quantifiable goals and enacting data-driven operational programs.

Application security focuses on being resilient, becoming stronger from attacks

Security threats have been a major sore point in 2017.

Ransomware attacks rose sharply. Shadow IT and casual “bring your own device” (BYOD) policies exposed companies to hacking and theft.

There was an overall 164% increase in breached records last year. Big corporations aren’t the only targets, either: 43% of cyber attacks target small to medium businesses.

To meet evolving threats, security software itself needs to evolve.

The focus is shifting from brute strength protection to intelligent detection and response.

Adaptive security software should continuously monitor and scan for threats and address issues on its own.

Once the problem has been addressed, the software should make changes to fix the weak spots exposed by the attack.

That flexibility is where security companies will focus moving forward.

Security needs to be built into software from the beginning, so 2018 will see increasing collaboration between security personnel and programmers and project managers, as well.

Automation will give businesses a competitive edge in the global marketplace

Between increasing amounts of data, a fast-paced global economy, and a tech skills shortage, companies need strategies to do more with fewer resources.

Automation meets that need by taking tedious or routine tasks off human hands.

Highly automated companies are six times more likely to experience revenue growth of more than 15%.

Companies are responding to such a clear demonstration of potential: 54% of global companies use automation technologies, and more plan to do so in 2018.

The biggest increase will be in IT and data management.

Automation will also shorten software development cycles without decreasing quality (through automated testing).

Where will 2018 take your company? If you’re looking to push digital transformation, Concepta can show you the way! Our developers have the right skills to help you visualize your data and unite your digital operations. Schedule your free consultation today!

Request a Consultation

Download FREE AI White Paper

How to Use Predictive Analytics to Forecast Sales Staff Commissions


Predictive analytics is increasingly accepted as a way of improving the customer experience or optimizing supply lines, but it’s underutilized in one area: forecasting labor costs.

That goes double for sales staff that work on commission.

Managers need to be able to predict their commission expenses, but the qualities that make sales staff good at selling make them bad at predicting which deals will close.

Their optimism is a problem for CFOs trying to forecast expenses.

Enter predictive analytics, the voice of reason that brings hazy forecasts back in line with reality.

Executives can use tools already at work in other areas of the company to better prepare for the future. How?

To understand, start with the specific difficulties of predicting commissions and then see what a predictive analysis does differently.

Commissions as an accounting problem

Accounting for commissions is one of a CFO’s biggest headaches.

Although commissions aren’t paid until a sale is made, best practices require that they be included when the cost is incurred to track profitability.

There’s a surprising amount of detail involved in forecasting commission expenses.

It involves predicting not only sales but also which agents will close which sales. Most companies have a variety of pay structures to account for based on who made a sale and when; a small mistake could have a large impact on the overall budget.

Choosing an estimation model isn’t easy, though. There are a few common approaches:

  • Use the first months of a year to create a fixed monthly estimate for the rest of the year. This method is easy to use but not very accurate. Fixed monthly estimates don’t account for seasonality, labor fluctuations, product changes, and other factors.
  • Use the previous year’s monthly commissions as monthly estimates. Previous-year totals are as easy to manage as fixed monthly estimates. They’re also more accurate since they reflect seasonal influences and company-specific trends. What they miss are allowances for outside influences (market fluctuations, new competitors, supply problems) or internal change (new staff, commission structure changes, mergers).
  • Rely on sales staff predictions to project expenses. Good salespeople are often bad forecasters. 54% of deals predicted by sales staff never close because agents tend to be unwilling to admit defeat on a sale. In addition, staff paid through commissions have little incentive to accurately forecast since doing so takes up time they could be selling.

Getting answers with Predictive Analytics

Predictive analytics offer greater accuracy than traditional models.

The process begins with feeding a machine learning algorithm reams of data on customers, market fluctuations, sales staff activity, and more.

The algorithm looks for patterns and relationships between factors that may impact performance.

It then uses those conclusions to produce a tailored month-by-month prediction of commission expenses.
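The pattern-finding step can be illustrated with a deliberately tiny stand-in: an ordinary least-squares fit relating monthly sales volume to commission expense, then projected onto a sales forecast. Production systems use far richer features and ML libraries; the sales and commission figures below are invented for illustration.

```python
def fit_line(xs, ys):
    """Ordinary least squares for a single feature (slope, intercept)."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
            sum((x - mean_x) ** 2 for x in xs)
    return slope, mean_y - slope * mean_x

# Historical months: units sold and commissions paid (illustrative numbers)
units = [100, 120, 90, 150]
commissions = [5000, 6000, 4500, 7500]

slope, intercept = fit_line(units, commissions)

# Month-by-month projection of commission expense from a sales forecast
forecast_units = [110, 130]
projected = [slope * u + intercept for u in forecast_units]
```

A real predictive model would also fold in seasonality, pay-structure rules, and per-agent performance data, which is where the accuracy gains over fixed estimates come from.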

These predictive estimates are a game-changer.

Companies that implement data-driven forecasting have an 82% accuracy rate on a deal-by-deal basis versus a 46% rate for those using other methods.

In aggregate their accuracy rises to 95%, nearly 20 percentage points higher than the industry average.

Using predictive analytics in this manner is highly efficient.

Much of the needed data is also beneficial elsewhere in the organization. For example:

  • Sales numbers help project revenue.
  • Staff performance data informs human resources processes.
  • Market factors are useful in optimizing the supply chain and spotting opportunities.

Bring the whole team on board

Predictive models don’t replace the sales staff in forecasting, but they do provide incentives for participation.

When the data they submit is accurate salespeople are rewarded with results that identify which clients are most useful, where their time can be spent most profitably, and what commissions they can expect throughout the year.

That promotes large-scale support of predictive methods within the company.

Consistent internal adoption increases the ROI on technology investments.

In short, extending predictive analytics into the accounting realm can positively affect overall profitability and performance.

Savvy CFOs should investigate how their processes might be improved by embracing predictive analytics.

An intuitive, easy-to-navigate interface makes predictive analytics accessible to everyone, not just the CIO and IT. Contact Concepta to learn about our custom analytics dashboards!

Request a Consultation

Download FREE AI White Paper

Identify Leads Who Make the Best Customers


Lead scoring is a key part of the marketing process, so it’s not surprising that it’s also an area where predictive analytics has a major impact.

Predictive lead scoring pushes the utility of data forward to the Sales Department in the form of actionable insights.

Let’s take a look at how this technique translates into increased revenue.

Defining Good Customers

Lead scoring is the process of assigning a number to a customer that represents their estimated potential value to the company.

The criteria used differs from company to company, but in general marketers consider a customer’s:

  • Interest in the product
  • Need for the product and position in the buying cycle
  • Ability to approve purchases or agree to contracts
  • Maximum potential lifetime value

Traditional Versus Predictive Systems

Using the traditional method of lead scoring, the CMO or another executive creates a list of traits which are perceived to increase the likelihood of conversion.

Each trait is assigned a positive or negative modifier based on how much it could affect the customer’s value.

Some examples of positive qualities are working within the company’s primary target industry, holding a high-level position in the C-suite or with decision-making authority, expressing interest in the product by filling out online requests for information, or attending relevant trade shows.

Traits that are assumed to lessen a lead’s value include holding a position with no purchase authority, a mismatched income bracket for target market, and living too far from the company’s physical stores to shop regularly.

Once all points are assigned the total is calculated, then used to evaluate whether the lead is worth passing to the sales team.

The score can also indicate a lead’s priority (should it be moved in front of other sales calls or does it need to mature?).
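The traditional method described above amounts to hand-assigned modifiers summed into a score and compared to a threshold. Here's a minimal sketch; the trait names, weights, and threshold are illustrative assumptions, not a standard.

```python
# Hand-assigned modifiers, as in traditional lead scoring.
# Trait names and point values are illustrative assumptions.
TRAIT_MODIFIERS = {
    "target_industry": 20,
    "decision_maker": 25,
    "requested_info": 15,
    "attended_trade_show": 10,
    "no_purchase_authority": -20,
    "outside_service_area": -15,
}

def score_lead(traits):
    """Sum the modifiers for every trait the lead exhibits."""
    return sum(TRAIT_MODIFIERS.get(t, 0) for t in traits)

def route_lead(traits, threshold=30):
    """Pass high-scoring leads to sales; let the rest mature."""
    return "sales" if score_lead(traits) >= threshold else "nurture"
```

The weakness the article goes on to describe is visible right in the table: every number in `TRAIT_MODIFIERS` is someone's guess.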

Predictive lead scoring (sometimes shortened to PLS) is much less subjective.

An algorithm analyzes data on existing customers along with their performance, creates a formula to weigh characteristics that are demonstrated to influence customer behavior, and applies that to incoming leads to predict customer value.

The marketer doesn’t create the list of “desirable characteristics.”

Instead, the algorithm identifies the most relevant characteristics through techniques like clustering.

It estimates not only potential lifetime value but also where a customer is in the buying cycle.
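To contrast with the hand-picked weights of the traditional method, here is a toy version of the predictive idea: derive each trait's weight from how much it shifted conversion rates among existing customers. Real PLS uses full ML models (clustering, regression) over much more data; the customer records below are invented for illustration.

```python
# Learn trait weights from historical outcomes instead of guessing them.
# The sample records are illustrative assumptions.

def learn_weights(history):
    """history: list of (traits, converted) pairs from existing customers.
    A trait's weight is how far conversion among leads with that trait
    departs from the baseline conversion rate."""
    base_rate = sum(converted for _, converted in history) / len(history)
    all_traits = {t for traits, _ in history for t in traits}
    weights = {}
    for trait in all_traits:
        with_trait = [c for traits, c in history if trait in traits]
        weights[trait] = sum(with_trait) / len(with_trait) - base_rate
    return weights

def score(lead_traits, weights):
    """Score an incoming lead with the learned weights."""
    return sum(weights.get(t, 0.0) for t in lead_traits)

history = [
    ({"downloads_whitepapers", "company_email"}, 1),  # converted
    ({"downloads_whitepapers"}, 1),                   # converted
    ({"company_email"}, 0),                           # did not convert
    ({"far_from_store"}, 0),                          # did not convert
]
weights = learn_weights(history)
```

Notice that the algorithm, not the marketer, decides that downloading whitepapers matters and a company email alone doesn't; that's the data-over-guesswork advantage the bullets below describe.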

The Benefits of Predictive Lead Scoring

The predictive method is finding favor with marketers, and for good reason.

It has a number of advantages over traditional lead scoring.

  • Acts on data rather than guesswork: PLS removes the bias factor in deciding what is and isn’t relevant. There is still some potential for programmer bias, but generally enough people work on enterprise software to reduce that risk.
  • Considers implicit as well as explicit data: Explicit data such as business titles and whether a lead has a company email are useful, but they only tell part of the story. Much more can be gleaned from implicit data: posting in trade forums, downloading related publications, interacting with certain content on the company’s website. These are telling signs that a customer is moving forward in the buying cycle.
  • Works at scale: Automated lead scoring processes vastly more data than traditional methods. Companies that use it are able to adjust relatively quickly to new or expanded markets, giving them a competitive edge over their rivals.
  • Highlights optimal marketing channels: Knowing what actually matters to customers makes campaigns exponentially more effective. Marketers can target their best customers through the channels that matter most to them.
  • Suggests tactics to prevent churn: Predictive lead scoring identifies areas where customers fall out of the buying cycle so CMOs can address those weak points.

Laser-focused Lead Scoring

At its core, predictive lead scoring is about prioritizing accuracy and efficiency.

PLS zeroes in on the traits that have a measurable impact on lead performance, leading to more revenue and fewer hours wasted on weak leads.

How can your data help identify your best leads? Contact Concepta to learn more about our business intelligence and data science services!

Request a Consultation