Is Trendy the new Traditional?

At the end of every year, we’re treated to a seemingly endless stream of articles, surveys, research papers, podcasts, and more identifying the big marketing and technology trends for the coming year.  You might say it’s become a tradition to participate in this prediction market, one that’s further compounded by the start of the Consumer Electronics Show in January, where companies and techies gather to show off semi-viable products that might be years away from market (I’m talking about you, flexible cell-phone screens).

More than a few of these predictions have been made before, sometimes many times…

Of course, 2016 was no different, and the consensus seems to be that we’re poised to experience mainstream Artificial Intelligence: a personal assistant powering the connected home, complete with fridges that restock themselves — with sponsored products, of course.  On a perhaps more pedestrian note, we’re also expected to see the continued rise of video as the preferred marketing medium, ideally played on the large touch screen in your new self-driving car.

If you follow the market closely, however, you’ll note that some of these predictions have been made before, sometimes many times.  Artificial Intelligence is relatively new as a hot topic in marketing, but even mainstream media outlets like the Washington Post identified it as the hottest new tech trend back in March of 2014, noting Google’s search results and Facebook’s social graph as key areas to expect an impact.  Likewise, the so-called “internet of things” (i.e., the connected home) has been promising to disrupt everything for at least as long, if not longer.  And video?  Well, it’s been a trend since before YouTube.

Where should cutting-edge marketers invest their precious marketing dollars in a rapidly evolving and constantly expanding landscape?

This isn’t to suggest that amazing things haven’t developed during this period, or that these predictions aren’t valuable in their own right.  It’s always a good idea to keep your eyes on the road ahead in both the marketing and technology worlds.  At the same time, we should exercise proper caution before funneling our budget dollars toward the next new thing.  Many of these trends never come to fruition, or, when they do, they take a radically different form than originally imagined.

Consider wearables:  They’ve been a perennial hot topic since long before Apple launched its first watch in 2015, yet the market-dominating sales we’ve seen with phones and tablets have yet to materialize for any manufacturer.  As a result, wearables represent a potentially valuable niche, especially if you’re in fitness or healthcare services, but not something every company needs to invest in right now.

Users are beginning to congregate around their preferred communication mediums and applications while explicitly avoiding others…

The real question then becomes:  Where should cutting-edge marketers invest their precious dollars in a rapidly evolving and constantly expanding landscape?  In my opinion, the answer is to continue utilizing a traditional, research-driven approach.  Core principles still matter regardless of the medium, and the fundamentals of identifying your target audiences and ROI objectives are more important than ever in navigating an increasingly fractured media universe.

If the trends are any indication, fractured is probably the best way to describe it, especially when not every demographic is embracing every trend and users are beginning to congregate around their preferred communication mediums and applications while explicitly avoiding others.  For example, video and its hottest broadcast platforms, such as Periscope and Instagram, are very popular among younger, more urban demographics, but more seasoned consumers are likely getting their content from other sources.  Generation X and older tend to be on Facebook or YouTube, where the type and length of the message is likely to be very different.

With that in mind, I would suggest returning to a few key principles before you spend a single, hard-earned marketing dollar:

  1. Clearly identify your target audiences and the story you want to tell before worrying about the methods and mediums.  Good storytelling techniques are universal and timeless; they should remain the foundation of your campaigns, and they can be adapted to meet the needs of current trends.  In other words, don’t let the medium dictate the message.
  2. Do your homework before planning your spend.  Know where your consumers are, how they are consuming media, and even when they are most receptive.  Also, remember that consumers themselves are feeling pressure from the fractured media landscape as they determine where to spend their limited attention and leisure time.
  3. Don’t be afraid to experiment.  Big data is susceptible to misinterpretation, and even the best research can return false positives.  Therefore, it’s always a good idea to set aside a portion of your spend to try something different.  If you’re a company that prides itself on being on the bleeding edge, go ahead and build that app even if the audience isn’t as large as your social presence.  Things change fast, and it’s good to be prepared.
  4. Report on your results with extreme prejudice.  If there is one truism among the trends, it’s that things change faster than ever and audiences move quickly from platform to platform.  You need to carefully measure, analyze, and re-imagine your results to keep up (see the sketch after this list).  Instagram isn’t even a decade old and it’s already upending video marketing as we speak.  Twitter just crossed the ten-year milestone and has already changed political marketing forever.
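As a minimal illustration of the reporting in point 4, here is a hedged sketch of per-platform measurement in Python; the platforms, numbers, and the simple ROI formula are invented for illustration, not data from any real campaign:

    # Minimal sketch of per-platform campaign reporting (all numbers invented).
    # ROI here is the simple ratio (attributed revenue - spend) / spend.

    campaigns = [
        # (platform, spend in dollars, attributed revenue in dollars)
        ("Facebook video", 10_000, 14_500),
        ("Instagram stories", 6_000, 9_300),
        ("YouTube pre-roll", 8_000, 7_200),
    ]

    for platform, spend, revenue in campaigns:
        roi = (revenue - spend) / spend
        print(f"{platform:18s} spend=${spend:>6,} revenue=${revenue:>6,} ROI={roi:+.0%}")

Even a toy report like this makes the point: a channel can be trendy and still return less than it costs, which is exactly why the measurement step can’t be skipped.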

To paraphrase Ferris Bueller:  It’s true that marketing moves very fast, and if you don’t stop and look around once in a while, you could miss it.  But that doesn’t mean we shouldn’t stay skeptical when a new trend is crowned tradition overnight.


Ending the Year Thankful for LinkedIn and Modern Tech

At the risk of dating myself a bit, my stepdaughter was on the couch earlier this month doing some schoolwork while her mom and I were whipping up dinner in the kitchen.  She didn’t seem to think anything was particularly special about the ability to earn college credit from a chaise, wirelessly connecting to the home and ultimately her school network via a thin and light MacBook Air running on lithium-ion power.  This is the brave new world she grew up in, and technological wizardry doesn’t even warrant much mention any longer.

I compared that to my own adventures a generation prior:  Multiple trips to the library, possible citations scribbled on dozens of index cards…

It turns out that she was working on a dreaded Research Paper, and she proceeded to remark how difficult that kind of assignment can be.  Personally, I hadn’t thought about Research Papers since college myself — I believe my last one was 37 or so pages on Bruce Springsteen for a seminar on the United States since 1945; he single-handedly saved the world from disco, so could he show us how to save ourselves? — and, after casting my mind back to my own experiences with the genre, I was inclined to agree that they were perhaps a unique form of scholastic torture.

Then it occurred to me that my generation couldn’t have produced such a paper laid out on the couch, using only a notebook computer.  I asked her if she’d gone to the library, to which she replied that it was no longer necessary; the school has an online resource center you can search from anywhere.  I asked her if she needed to do footnotes with page numbers, but, no, they only needed to include a link.  I asked her how she organized her references, and she appeared to have no idea what I was talking about.  You just copy and paste, all into one magic, infinitely editable Word document.

By any measurable standard it was several orders of magnitude more difficult, but isn’t that the way of the modern world?

I compared that to my own adventures a generation prior:  Multiple trips to the library, possible citations scribbled on dozens of index cards whether they would be used or not, and an outline, without which you had no idea which cards you actually needed, all this before we even sat down at a typewriter.  Compared to our parents, we were the lucky ones because the typewriter was electric and had corrective tape built in!  In college, we were thankful to ditch the typewriter for a computer, but the preamble remained the same, or perhaps got worse, as the NYU library was across Washington Square Park on a cold winter’s night.

In any event, it was absolutely nothing like the process my stepdaughter followed, and by any measurable standard it was several orders of magnitude more difficult, but isn’t that the way of the modern world?  For all the incessant media coverage of the potential pitfalls — cyber bullying, fake news, an AI apocalypse, or a President-elect addicted to Twitter, to name a few — the simple truth is that technology has vastly improved the great majority of our lives, and I don’t think we really want to unwind the clock.

In my opinion, technology — including social platforms like LinkedIn — has been incredibly freeing for the human spirit and imagination…

Personally, I have no interest in manually balancing a checkbook, frequently getting lost and consulting paper maps or asking for directions, getting stuck on the side of the road and waiting for someone to help, being limited to a few network channels, interpreting the scribbles that passed for handwriting on a written letter, or even keeping track of everything like phone numbers and addresses, photographs and recipes, on endless pieces of paper with no back up.

In my opinion, technology — including social platforms like LinkedIn — has been incredibly freeing for the human spirit and imagination — if only because we spend less of our precious and limited time on this Earth doing the mundane while simultaneously having access to much more of just about everything than ever before.  This includes the ability to connect with people and maintain relationships we would have simply lost in an earlier era.

We should ask ourselves: what function does most technology, or at least most widely used consumer technology, actually perform?  What need does it fill?

Is there a downside?  Of course; that is always the way of human affairs.  No sooner was the printing press invented than someone started complaining about the masses having access to too much information.  If there’s one constant in human nature, it’s a fear of change.  But, much as things have changed since the Reagan era of my youth, we should ask ourselves: what function does most technology, or at least most widely used consumer technology, actually perform?  What need does it fill?

I would argue that it enhances and improves the constants of human nature, the things we’ve always valued and will continue to value far into the future:

  1. Communication and access to information.  We’ve been described as storytelling animals, and we now have access to more stories and a greater ability to share our stories than ever before, not to mention the ease of contacting help in an emergency.
  2. Friends, family, and other relations.  Social media has made our networks larger, more diverse, and more stable than ever before.  While we might not spend as much time on each individual relationship, we now have access to more varied people and more points of view.
  3. More time to think, hope, and dream.  Not only is it far easier and less time-consuming to organize your life, chances are your life will be longer thanks to things like improved car safety systems.

In short, as we end the year, let’s take a moment to celebrate LinkedIn and other technologies that were unheard of a generation ago, and acknowledge their incredibly positive impact on most of our lives.  I assure you that there will be plenty of time in 2017 to worry about what all this has to do with marketing trends and how to stand out in a fractured media landscape.

Preproduction Blends Into Production + A Logo in Search of a Script

In the grand tradition of Friday the 13th, we’ve got a working draft of a logo for the new film project, Master Pieces, and, starting a tradition of our own, we officially entered production on December 13, 2016, even though we’ve yet to finish the script.  The situation isn’t as dire as it sounds, however, as the script is now on beta version 0.9 and is expected to be wrapped up soon, with full production continuing after the holidays.

In the meantime, the latest version of the logo and some screen grabs of sample production footage are below.

[Image: Master Pieces Sample Shot]  Horror films generally play with the audience’s expectations of moviemaking in general, for example by filming establishing shots with a POV feel.
[Image: Master Pieces Sample Shot 2]  In these two shots, we explore the primary location using a hand-held camera.
[Image: Master Pieces Sample Logo]  The first draft of the Master Pieces logo.  It is sure to change, but should give you a sense of the direction we are taking.

Master Pieces is a retro-horror film inspired by the classic slasher and haunted-house movies of the 1980s: think thrills and chills on a low budget.  An official website, supporting information, and a making-of series will be coming soon, but please check back here for updates until the official announcement.


Posthaste Preproduction for “Master Pieces”

The new film, Master Pieces, has officially entered preproduction with a series of test shots designed to experiment with lighting, space, and potential placement of actors.

As this is guerrilla filmmaking, we will be moving fast and furious from here:  Official production is scheduled to start next month, and we don’t even have a final script or cast yet, but that’s all part of the charm for this throwback to the classic ’80s horror films.

[Image: Stairway Sample Lighting Setup]  Guerrilla filmmaking requires fast setups that still establish an appropriate atmosphere.  In this unretouched still, we see how directional lighting and straight framing can be used to create the appropriate mood.

[Image: Drain Test Shot]  Creating a haunted house without spending money on effects will require creative setups featuring haywire household items and appliances.  What’s more annoying than a drain that won’t stop dripping?

[Image: Shaving Test Shot]  The overall mood will be further established by framing shots with multiple viewing planes.  In this sample, we see a figure close to the camera and a master bedroom while we peek into the bathroom.

More updates will be coming soon, or at least they had better be, or we’ll be falling way behind schedule.  In an ideal world, there will also be a YouTube series and a full production log, Making Our Master Pieces, both of which will ultimately serve as raw material for a cursed making-of sequel in the vein of Scream 3 and Nightmare on Elm Street: New Nightmare.

Of course, we’d be ahead of both of those franchises by several films if we shoot the making-of alongside the original picture, but it pays to be efficient, especially in low-budget slasher and haunted-house flicks.


Salesforce Einstein vs The Real Deal

The AI craze is officially upon us, with the majority of marketing and consumer technology providers promoting some form of advanced machine learning, and some of the leading minds in business and academia, like Elon Musk and Stephen Hawking, warning of an imminent computer revolution right out of James Cameron’s Terminator film series or, perhaps more topically after the season finale last night and my post last week, Jonathan Nolan and Lisa Joy’s Westworld.

One of the latest entries in the rapidly evolving space is Salesforce’s Einstein, an AI that the leading CRM and cloud solution provider claims is “built into the core” of the Salesforce platform.  Einstein promises to be your own personal data scientist and assist with discovering insights, predicting outcomes, determining next best steps, and automating tasks.  The language used to describe Einstein is very human, a far cry from the antiseptic modeling language of earlier incarnations of similar software, and he even has a cute little graphic representation.  Is the revolution already upon us?

Computers are barely scratching the surface of a small portion of human talent…

As usual in the marketing world, one needs to separate fact from fiction, and the truth lies somewhere in between.  It’s a fact that big data, ridiculously fast processing power, and increasingly sophisticated algorithms are completely changing our perception of what computers are capable of, and that some of the latest and greatest software and devices can seem intelligent at times.  It’s also a fact that this new technology has real benefits for marketers as they seek to engage in truly personalized interactions with their visitors.

Thanks to solutions like Einstein, the same types of technologies that power a brief, almost-human exchange with Siri or Google’s predictions about where you’re headed on a road trip can be applied to your marketing campaigns.  This offers your organization ever more sophisticated and targeted models to uncover new relationships among your customers and prospects, and the ability to improve engagement by personalizing interactions in real time.

These are truly amazing developments in the marketing world, and they will change the way we plan campaigns, segment customers, and measure ROI, but, getting back to the original topic, is any of this truly intelligence in the human sense of the term?  Or, more dramatically, do we need to be worried about the future of the human race if we set Einstein loose on our target audiences?

For example, a Presidential election forecasting model that…well, there’s part of the problem in a nutshell…

That’s a far more difficult question to answer, partially because intelligence itself is very difficult to define, and there are a lot of behaviors — memory, creativity, intuition, reasoning, etc. — that are usually blended together when we’re talking about the general subject of human intelligence.  No offense to Mr. Musk, Mr. Hawking, Mr. Cameron, or Mr. Salesforce, but I believe computers are barely scratching the surface of an exceedingly small portion of human talent, and that a lot of what we’re seeing in the AI craze is anthropomorphic language overselling a certain type of data-driven analysis.

At its most extreme, I think it is fair to say that current technology is capable of (very) limited forms of what we would traditionally call deductive and inductive reasoning, though even that is a stretch given that the algorithms that actually draw conclusions from the data are written in advance by human programmers.  In the deductive case, the appropriate algorithm and associated databases are designed to draw conclusions from multiple premises.  For example, I often go to my mother’s house on the weekend; therefore, if I’m traveling on the road I usually take to her house and it’s a Saturday afternoon, Google assumes that’s my destination, and it might even give me the travel time to her house in advance every Saturday morning.
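To make the deductive pattern concrete, here is a toy sketch in Python; the route name, the rule, and the thresholds are all hypothetical illustrations, not Google’s actual logic:

    from datetime import datetime
    from typing import Optional

    # Toy deduction: the premises (the rule) are written in advance by a human;
    # the machine merely draws the conclusion when the facts line up.
    # This is NOT Google's real algorithm, just an illustration of the pattern.

    KNOWN_ROUTE_TO_MOMS = "route_9_north"  # hypothetical route identifier

    def predict_destination(current_route: str, now: datetime) -> Optional[str]:
        # Premise 1: Saturday-afternoon trips on this route usually end at Mom's.
        # Premise 2: it is Saturday afternoon and we are on that route.
        # Conclusion: the destination is probably Mom's house.
        if (current_route == KNOWN_ROUTE_TO_MOMS
                and now.weekday() == 5          # Saturday
                and 12 <= now.hour < 18):       # afternoon
            return "Mom's house"
        return None  # not enough premises match; make no prediction

    print(predict_destination("route_9_north", datetime(2016, 12, 17, 14, 30)))

Note that all of the “intelligence” lives in the human-authored rule; the machine just checks the facts against it.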

The inductive case is a bit more complicated.  A large set of premises that are either true or mostly true is turned into specific predictions.  This is usually done with the assistance of a living and breathing data scientist: the computer crunches the numbers and the human reviews the output.  For example, a Presidential election forecasting model that…well, there’s part of the problem in a nutshell.  Models are nice and neat; the real world is a lot messier and more difficult to capture in the discrete data points that feed both artificial intelligence approaches.  While a certain marketing persona has the propensity to become a new customer, that doesn’t mean that an actual person will be one.
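For the inductive side, here is a minimal sketch of a propensity model; the features, the data, and the scikit-learn pipeline are assumptions for illustration, not how Salesforce’s Einstein actually works:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Toy propensity model: induce "likelihood to become a customer" from
    # (fabricated) historical data. A real model would be built and reviewed
    # with a living, breathing data scientist.

    # Feature columns: [site visits last month, email opens, fits persona (0/1)]
    X = np.array([[1, 0, 0], [3, 2, 1], [8, 5, 1], [2, 1, 0],
                  [7, 6, 1], [0, 0, 0], [5, 4, 1], [1, 2, 0]])
    y = np.array([0, 0, 1, 0, 1, 0, 1, 0])  # 1 = became a customer

    model = LogisticRegression().fit(X, y)

    # Score a new prospect who fits the persona: a propensity, not a promise.
    prospect = np.array([[4, 3, 1]])
    print(f"propensity: {model.predict_proba(prospect)[0, 1]:.0%}")

The last comment is the whole point of the paragraph above: the model outputs a probability about a persona, and no probability guarantees what any actual person will do.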

It’s dealing with this very messiness, via the human capacity for intuition and ingenuity, that has yet to be replicated by a machine and that — in my opinion — remains an essential part of true marketing intelligence.  Before we return to the marketing world, though, let’s consider the real Albert Einstein, widely regarded as one of the most brilliant minds ever to walk the Earth.  How does his Salesforce-branded AI mascot stack up?

While a certain marketing persona has the propensity to become a new customer, that doesn’t mean that an actual person will be one…

I think most historians of science would acknowledge that Einstein’s greatest gift was the ability to make incredible intuitive leaps, to combine completely unconnected data in entirely new ways that ultimately invalidated (some of) the old assumptions and data.  In Special Relativity, it was assuming that the ether didn’t exist at all and that there was no privileged frame of reference.  In General Relativity, our modern theory of gravity, it was the idea that acceleration and gravity are the same.

Furthermore, the key insights often came from simple thought experiments.  For General Relativity, Einstein imagined an individual in outer space in a windowless elevator: if the elevator were being pulled upward at the same acceleration as Earth’s gravity exerts downward, the individual would have no means to determine whether he or she was accelerating or sitting in a gravitational field.  He also imagined that if you were to jump from a very high building along with some office supplies, everything would fall at the same rate and, for a while at least, it would seem that you weren’t subject to the effects of gravity at all, as if you were floating in outer space; acceleration, in other words, can also cancel gravity’s effects.
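For the mathematically inclined, the elevator boils down to the textbook statement of the equivalence principle; this formulation is standard physics, not anything from the post itself:

    % Equivalence principle, per the elevator thought experiment:
    F_{\text{felt}} = ma \quad \text{(elevator accelerating upward at } a\text{)}
    F_{\text{felt}} = mg \quad \text{(elevator at rest in gravity } g\text{)}
    % If a = g, no local experiment can tell the two cases apart; and in
    % free fall the acceleration cancels gravity, so the felt force vanishes.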

This kind of insight and imagination — or dare we call it storytelling — isn’t subject to reason or modeling until after a living, breathing Einstein makes the mental leap.  In Einstein’s case, these leaps led to assumptions that became some of the most powerful ideas in human history.

Identifying and engaging customers without us worrying about the fate of the world…

In short, this isn’t the kind of conclusion Salesforce’s Einstein is going to make anytime soon.  While the revolution will likely be televised, it’s still several generational leaps of technology in the future.  In the meantime, we can safely enjoy the benefits of big data and machine learning as they make our personal and professional lives easier.

In other words, let Google keep helping us avoid traffic, and let the new Salesforce Einstein keep identifying and engaging customers, without us worrying about the fate of the world.  Let’s also be sure to remember that marketing is both art and science; machines can help round out the science, but humans are needed to develop truly engaging stories and exercise creative judgment on how best to connect with other humans.  This is true for marketing topics large and small; we can discuss further in my next post, Google vs Google:  Do You Need Best Practices for Your Best Practices?


HBO’s Westworld and Your QA

Fans of HBO’s newest hit series Westworld might love it for the suspense and intrigue, but technology and marketing geeks can also appreciate the emphasis on quality assurance at the sci-fi theme park.  In the world imagined by co-creators Jonathan Nolan and Lisa Joy, QA is a department that shares equal power with the development teams in Behavior.

Elevate the role of quality assurance staff…

It’s an interesting take on the oft-repeated notion that quality can’t be inspected in at the end of a project, a notion closely related to the “shift-left” philosophy of testing early and often.  Real-life companies can take a lesson from the show by considering organizational structures that elevate the role of quality assurance staff and view them as equal partners with developers and other implementation teams.

This partnership approach will help in two critical ways.  First, it will highlight quality needs from the first to the last stages of every project.  This simple structural change should not be underestimated — simply ensuring that there is adequate time in a project plan for proper testing will be a big help for some organizations struggling with quality.  Second, and perhaps more importantly, projects will benefit from a broader diversity of skills.

QA professionals can offer a different perspective than project managers, developers, designers, etc.  This helps improve the overall results by broadening the resources available and integrating fresh ideas early in the project lifecycle.  It will also help ensure that the specifications and other early-stage assets are fully in line with the user requirements and the overall goals of the project.

Following the proper testing scripts might not prevent a robo-apocalypse…

Of course, Westworld also illustrates a few potential pitfalls of either not listening to your QA team or bypassing testing protocols entirely.  In the early episodes, the initial impetus for the story was the premature deployment of the enhancement that allowed the hosts to experience “reveries.”  Anthony Hopkins’s Dr. Ford character inserted a few lines of code immediately prior to release, and full regression testing wasn’t performed.  While most of us are managing websites, apps, or marketing campaigns instead of potentially murderous robots, the lesson should still resonate:  Nothing should be deployed without following all applicable testing protocols, and even small, seemingly innocuous changes to code can have negative repercussions.
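As a minimal sketch of that lesson in pipeline form, consider a deployment gate that refuses to ship until the full regression suite passes; the test runner and the deploy script here are hypothetical placeholders for whatever your pipeline actually uses:

    import subprocess
    import sys

    # Toy deployment gate: no release without the full regression suite.
    # "pytest" and "./deploy.sh" stand in for your real test runner and
    # deploy step; the principle is what matters.

    def deploy_if_tests_pass() -> int:
        tests = subprocess.run(["pytest", "--quiet"])  # run everything, no shortcuts
        if tests.returncode != 0:
            print("Regression tests failed; deployment blocked.", file=sys.stderr)
            return tests.returncode
        return subprocess.run(["./deploy.sh", "production"]).returncode

    if __name__ == "__main__":
        sys.exit(deploy_if_tests_pass())

Dr. Ford’s last-minute code insert would never have made it past even a gate this simple, which is rather the point.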

While following the proper testing scripts might not prevent a robo-apocalypse, it can certainly improve the experience of your customers.


Welcome to my Website

This is the official inaugural post of www.christiantwiste.com, dedicated to all things Christian, and soon to be filled with musings for which I’m not remotely qualified, but I’ve always liked a challenge.
