It’s A Marathon, Not A Sprint

I’m no runner and I definitely don’t find it fun like others do. I’m the person who goes to the gym to lift weights and work on strength, but about two years ago I decided that I needed to challenge myself and get out of my comfort zone, so I set a goal of completing a half marathon. I accomplished that feat in November 2016 after a few months of training, and I just recently ran my second half marathon in November 2017. As I ran that first half marathon and then prepared for the second, I realized that there were many parallels between running and analytics and marketing technology implementations. The design and implementation of an analytics technology solution that meets your organization’s needs now and later can be a long journey. That can often be seen as a problem, as many organizations want it done as soon as possible.

Just jumping in is a mistake

You can’t one day say, “Hey, I’m going to run a 10K/half marathon/full marathon” and then go out and do it. You need to prepare and plan. You need to put work in ahead of time to get your body into the shape needed to maintain the necessary pace and distance. As the popular adage goes, failing to plan is planning to fail. Just jumping in and starting to deploy data collection logic to your site without putting in the work ahead of time will lead to issues and more rework than you realize. Time and resources will be wasted, and the data will be far from reliable and trustworthy. It’s not...

33 Sticks Welcomes Joe Orlet

A decade ago I started working in Digital Analytics. During this time I worked with brands such as MasterCard, Home Depot, HP, Hasbro, and McDonald’s. Along with working across multiple market verticals as a consultant, my digital analytics tenure also includes experience as a product end-user and employment with a product vendor. Over the years, I observed a number of Digital Analytics efforts become mired in a cycle of implementation and re-implementation. The reasons vary between companies; however, mitigating the impact falls upon the Digital Analytics Professional. While not applicable to every situation, I submit two general guidelines to follow: advocate implementation simplicity, and target project self-sustainability.

Implementation Simplicity

Implementation simplicity applies across many fronts, from eliminating overlapping products to collecting only the necessary data. When defining requirements, consider not only business value but also implementation and maintenance cost. The more complex an implementation, the less likely it is to be fully deployed; in turn, the cost of maintenance and the mistrust of data increase.

Project Self-Sustainability

Partnering with implementation simplicity is project self-sustainability. While complete self-sustainability is an infrequent occurrence, pursuing the goal is often worth the investment. Foremost, clearly document the business logic around data collection and error handling. This business logic allows breaking away from spreadsheets full of URLs and corresponding variables.

Self-sustaining implementations forgo rigid spreadsheets, relying instead upon rules that define variable structure and format. Often, with the addition of simple programmatic logic, implementations adapt as digital properties change. While this means relinquishing a certain level of control, it allows the implementation a safe degree of self-management. Trying to incorporate these two basic guidelines often delivers a...
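To make the rule-based approach concrete, here is a minimal sketch of what such programmatic logic might look like. The function name, URL structure, and fallback value are hypothetical assumptions for illustration, not from the original post: instead of looking up each URL in a spreadsheet, a variable is derived from the URL’s structure.

// Hypothetical sketch: derive a site-section variable from URL structure
// rather than maintaining a spreadsheet of URL-to-value mappings.
function resolveSiteSection(pathname) {
  // Rule: the first path segment names the section; default to 'home'.
  var segments = pathname.split('/').filter(Boolean);
  return segments.length ? segments[0].toLowerCase() : 'home';
}

// Example: resolveSiteSection('/Products/shoes-123') returns 'products';
// a new page under /products/ is handled with no spreadsheet update.

Because the rule encodes format and structure rather than an enumerated list of pages, new content conforming to the same URL conventions is tracked automatically, which is the self-management the guideline describes.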

The Dreaded Question: “Why Doesn’t This Metric in System A Exactly Match System B?”

Imagine, if you will, that a prospect hires you to come in and design a new analytics implementation using a new tool. Everything is going great, and you’re getting all the information you need from the stakeholder interviews you’ve conducted. You’ve identified all the systems in play, including the incumbent analytics tool that is going to be replaced. Your client continuously tells you how error-prone and incorrect it is. You complete the initial release of the new analytics implementation and start working with your client on the maturity roadmap to continuously enhance their new implementation. You’re making progress, but after a few months some users start to question why the numbers from the new implementation are different from the previous system. The new implementation has been tested thoroughly and there are no major issues. You reassure your client that the numbers are accurate. Even though they called the former system error-prone, they insist on comparing the new to the old, and since the new system generates different numbers, it must be wrong.

This seems like the perfect episode of the Twilight Zone, yet it’s something that is encountered frequently, with some variation. It’s something that I’ve experienced in the past and still experience today. No matter how much you confirm the new analytics implementation is accurate, your client still defers to the previous one. There are multiple reasons for this. They had used it for so long that they were accustomed to the information coming from it. They made decisions based on that information. Anything different must be wrong.

It’s a Trap!

I was involved in a...

Adobe Analytics Amazon Alexa Skill

If you are already familiar with the Adobe Analytics APIs, then creating an Adobe Analytics-Amazon Alexa Skill is a pretty simple process.

Building Alexa Skills with the Alexa Skills Kit

STEP 1: Create a New Skill

If you haven’t already, your first step is to create an Amazon Developer account. Once you have an account, you can create your first skill. From the Alexa Skills area, click ‘Add a New Skill’.

STEP 2: Skill Information

Name: This is the name of your Skill, should you choose to deploy it to the Alexa App Store.
Invocation Name: This is the name a user will use to call your Skill, e.g. “Alexa, ask Adobe….”
Application Id: This is a unique identifier for your application and is used as a check, within your endpoint script (more on that later), to ensure that the service calling your script is your Alexa Skill and no one else.

STEP 3: Interaction Model

The Interaction Model defines how users will interact with your skill.

Intent Schema

The Intent Schema is a JSON object that defines the key words and phrases that are spoken by a user and how they map to your Skill. For this example, we are building a very basic skill that simply accepts a reporting time period as an input. We accomplish this by defining an intent, GetDate, and a Slot that will hold the key reporting time period values. The Slot is defined using a name, ReportDate, and in this instance a list of valid dates for the Skill defined as LIST_OF_DATES.

Custom Slot Types

In this example, we are using a custom type...
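Based on the names given above (GetDate, ReportDate, LIST_OF_DATES), a minimal Intent Schema for this skill might look like the following sketch. The exact JSON is not shown in the excerpt, so treat this as an assumed reconstruction:

{
  "intents": [
    {
      "intent": "GetDate",
      "slots": [
        {
          "name": "ReportDate",
          "type": "LIST_OF_DATES"
        }
      ]
    }
  ]
}

With a schema like this, a sample utterance along the lines of “get report for {ReportDate}” would route to the GetDate intent, with the spoken time period captured in the ReportDate slot for the endpoint script to pass to the Adobe Analytics APIs.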

Is it possible to over-optimize how you use your TMS?

The short answer is yes. We’ve talked before about how important it is to have a clearly defined methodology for how you and those in your company will use your tag management system (TMS). We’ve worked with clients who didn’t take that step, and what happened was they ended up developing a point-solution approach: they created a new deployment condition or rule for every new element of data they wanted to collect or marketing vendor they wanted to deploy. Un-optimized setups like this lead to management nightmares and are very difficult to maintain. Instead of rehashing a conversation we’ve already had about optimizing how you use your TMS, this conversation is meant to warn you against going too far the other way and over-optimizing the use of your TMS.

One condition to rule them all

We’ve recently worked with a couple of clients whose analytics implementations were deployed through a TMS, and when we started looking, we saw that there was a minimal number of rules collecting data for fairly robust analytics reporting. As we dug deeper, we found that each rule was designed to handle multiple conditions and variations. On the surface, the analytics implementation looked clean and streamlined. It looked easy to manage as part of a TMS configuration that was also responsible for deploying marketing vendor tags. As we began to look under the hood, however, it became apparent that it wasn’t very clean. These few rules relied heavily on custom JavaScript to evaluate what the visitor was doing and generate the required analytics tracking. These rules were also rife...
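As an illustration of the pattern described above, a single over-consolidated rule often ends up wrapping many page states in hand-written branching logic. This sketch is hypothetical and not taken from any client implementation:

// Hypothetical custom code inside one catch-all TMS rule.
// Every page state is handled by branching logic that is
// invisible from the TMS interface itself.
function setTrackingVariables() {
  var path = window.location.pathname;
  if (path.indexOf('/checkout') === 0) {
    // set cart and order variables, fire purchase events...
  } else if (path.indexOf('/search') === 0) {
    // parse and set the search term, fire a search event...
  } else if (document.querySelector('.product-detail')) {
    // scrape product details from the page markup...
  }
  // ...and so on, one branch per scenario. Each new page type
  // means editing this script rather than adding a discrete rule.
}

Discrete, well-named rules trade a longer rule list for logic that is visible and maintainable within the TMS itself, rather than buried in a script only its author understands.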

Don’t Forget the Human Element

It happens to the best of us. We get deep into a client engagement and become laser-focused on the deliverables, along with all of the details and tasks that go into ensuring those deliverables are accurate, on time, and, most importantly, valuable to the client. We begin to lose sight of the strategic side and become focused on the tactical. This shift comes from the best, most well-meaning position: caring for the client and ensuring value is being delivered. It’s a trap we’ve all fallen into at one time or another.

When we lose sight of the strategic, we often lose sight of the human element. We become focused on tasks on a list, dates on the calendar, or the number of emails piling up that need answering. On the opposite end of those emails and tasks are other human beings, people who have the same goal as you: delivering value and successfully completing a project. They’re not just responses that need sending or boxes that need checking.

Building a rapport

We at 33 Sticks pride ourselves on making rapport building one of the key aspects of our engagements with clients. By building a rapport, we’re able to get a better feel and understanding for our clients’ businesses so we can tailor our engagement to solving actual business problems, not just ticking items off a list and billing hours. That’s a key way we provide our clients value. We’re also remote at 33 Sticks, which may seem like a limitation at first but actually enables us to provide unique benefits to our clients, such as not being tightly...