We are Gupta Media,
a digital marketing agency helping the biggest brands solve their hardest problems.

Contact Us · See Our Work

We help brands, teams, artists, and more achieve success using tactics like Search, Social, and Display.

Fender
Women's Tennis Association
Industry West
Red Bull
Appalachian Mountain Club
Sony Music
Riley Home
Republic Records
ESRB
E3
Radio Disney
Amazon
RCA
Cleveland Cavaliers
The Governors Ball
Walt Disney Records
SVCC
Case Study

Boston Calling 2017: Driving Media and Design Behind Boston’s Major Music Festival

Crash Line Productions came to us for our fifth year driving the media and design behind Boston’s first major music festival, Boston Calling. In 2017 the festival was bigger and better, condensing its typical fall and summer festivals into one event, with a new location, a killer lineup, and a major upgrade in amenities. Our…

Read More
Case Study

Augusten Burroughs This Is How Book Launch

This is How you launch a book! St. Martin’s Press wanted to launch noted LGBT author and humorist Augusten Burroughs’ latest book, This Is How, in a manner that captured his intensity and passion. Our creative team was given phone-quality video footage of Augusten discussing the book and the cover art. From there, they crafted…

Read More
Case Study

Capitalizing on Artists’ TV Performances

Background: A televised performance is an important moment for any artist, since it’s a chance for a national audience to see and hear an artist’s new music, potentially for the first time. Key TV performances lead to spikes in organic interest in an artist immediately following an appearance, as shown by Google Trends for Lady…

Read More
Case Study

AC/DC

Background: AC/DC is one of the most legendary rock bands of all time. Their Number One albums have stretched across multiple decades and transcended generations. However, until November 19, 2012, AC/DC’s music was only available in physical formats. On November 19, AC/DC’s entire catalogue became available on iTunes. The challenge here was to raise awareness…

Read More
Case Study

Build Up vs. Surprise Release Pros and Cons

Background: Over the past year, Gupta Media has been focused on running album sales campaigns with longer flights, including pre-order, to maximize the opportunity to reach core fans the moment that they are thinking about the product. This strategy focuses on connecting fans with appearances and promotions while keeping paid media running throughout. This is…

Read More

Featured Blog Posts

View all posts →

Term of the Day: A/B Test

View Our Glossary

A/B Testing, also known as a split test, is a method used to compare two versions of a marketing asset (webpage, ad, email, etc.) with one varying element, to determine which performs better. With A/B Testing, half of your audience is shown the original version of your asset (version A) and the other half is shown the modified version (version B). As your audience is served either version A or version B, their engagement is measured and collected. With this data, you can determine which version performed better and make careful, informed changes to your user experience and/or designs. A/B testing can help you determine which words, phrases, images, videos, and other elements work best when targeting your audience.
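In practice, the 50/50 split is often implemented by hashing a visitor ID so each person is consistently assigned to the same version. Here is a minimal sketch in Python; the experiment name and visitor IDs are hypothetical, and most A/B testing tools handle this assignment for you behind the scenes.

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "homepage-headline") -> str:
    """Deterministically bucket a visitor into version A or B.

    Hashing the visitor ID together with an experiment name keeps the split
    roughly 50/50 and guarantees the same visitor always sees the same version.
    """
    digest = hashlib.md5(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100      # a number from 0 to 99
    return "A" if bucket < 50 else "B"  # 50/50 split

# A few hypothetical visitors get stable, roughly even assignments
for uid in ["visitor-101", "visitor-102", "visitor-103", "visitor-104"]:
    print(uid, assign_variant(uid))
```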

A/B Testing Process:

1. Identify goals:

What are you trying to achieve by conducting A/B testing?  Goals can be anything from generating more organic traffic to increasing product purchases and email signups. 

2. Choose what you want to test:

Determine what single element you would like to test to see how it impacts performance.  The variable you choose to change could be related to your design, wording, or layout. Make sure you choose a variable that relates to what you are trying to achieve in your goals.  For example, if your goal is to generate more organic traffic, focus on an element that will impact SEO.

3. Create variations:

You can make your desired change using A/B testing software.  This change might be switching the background color, swapping the order of elements, or changing the headline.  Be sure that you are only changing one variable so that you can evaluate exactly how effective that change is.

4. Run experiment:

Split your audience as equally and randomly as possible: each person should be randomly assigned to either version A or version B, and their interactions with each version should be measured.

5. Analyze data:

Compare the measurements of your audiences’ engagement with the two versions to determine which performed better. If there is a statistically significant difference between the two versions, take action based on your results (see the sketch below).
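As a rough illustration of this last step, the sketch below runs a standard two-proportion z-test on made-up conversion counts. The 1.96 cutoff corresponds to roughly a 95% confidence level, and most A/B testing tools perform an equivalent check for you automatically.

```python
from math import sqrt

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-proportion z-test: does version B convert differently than version A?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    std_err = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / std_err

# Hypothetical results: 5,000 visitors saw each version
z = two_proportion_z(conv_a=400, n_a=5000, conv_b=460, n_b=5000)
print(f"z = {z:.2f}")  # |z| > 1.96 is roughly a 95% confidence level
if abs(z) > 1.96:
    print("Significant difference -- act on the winning version.")
else:
    print("No significant difference -- keep testing or try another variable.")
```

Because the example uses only Python’s standard library, it can be pasted into any Python environment to sanity-check the results your testing tool reports.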
