Brace yourself for a masterclass in app evolution as Andres Glusman, founder of DoWhatWorks, shares 15 years of wisdom gained at Meetup, where experimentation was the key to monumental growth. Discover the hidden truth: most experiments fail, but when they succeed, the impact is colossal. The secret? Thoughtful experimentation. It's about testing specific hypotheses, not just throwing everything at the wall and hoping something sticks. Experiments go beyond correlation to establish causation, the only way to truly gauge whether your product changes are working.
Relax and let someone else do the heavy lifting. That sounds pretty great to me. 3|SHARE has the best team of people dedicated to making clients wildly successful with digital transformation initiatives. 3|SHARE is an Adobe Solution Partner, and Rabiah Coon, Sr. Marketing Manager, discusses their specialization in Adobe Experience Cloud development, focusing on products like Adobe Experience Manager, Adobe Target, Adobe Campaign, and Adobe Workfront. They assist clients with website redesigns, integrating external tools, and maximizing the potential of Adobe Experience Cloud, working with industries such as pharmaceuticals, e-commerce, hospitality, and manufacturing. 3|SHARE's website plays a crucial role in conveying its services and expertise, while Rabiah's marketing efforts involve content planning, blog writing, email campaigns, and data analysis. Rabiah mentions that their participation in Adobe Summit conferences gives them the opportunity to connect with Adobe professionals, gain insights into upcoming technologies, and stay updated on industry trends, while their webinar series, 'Evolve with 3|SHARE', showcases their expertise and thought leadership. 3|SHARE's mission is to empower clients and optimize their digital platforms.
TRANSCRIPT OF THIS PODCAST EPISODE Hello! My name is Jörg Dennis Krüger, and as my sausage-cable-drum-winder at the front desk just said: yes, I am the conversion hacker. In this episode of the Conversion Hacking podcast we're going to talk about A/B testing. Anyone who has known me a little longer knows that A/B testing is one of my absolute core topics. I got started with it in 2008, actually as early as 2006, back then for Omniture. These days that is Adobe Test and Adobe Target; those are the old products we used and rolled out at large companies such as DKV, Allianz and the like. Ever since, it has been A/B testing for me, and my book, published in 2011, is called 'Conversion-Boosting mit Website Testing' for a very good reason: the book's focus really is A/B testing. In it I present the conversion-boosting model, how to approach website optimization and testing in the first place, and I show how to run tests and how to evaluate test periods. And so on and so on. I have to say, though, that I've since learned a bit more, because optimization is not really about testing at all. A/B testing got big thanks to Barack Obama: in his campaign he raised far more donations through A/B testing, and out of that fundraising effort today's A/B-testing vendor Optimizely emerged. It is essentially what was originally built for the Obama campaign. Of course a lot has changed since then; Optimizely has received something like 80 or 90 million in venture capital to develop the tool further, and so on. A/B testing tools: Optimizely, Google Optimize & Co. As sophisticated as the software has become, the entry price is by now very high, which is why I don't recommend it that often anymore. But it's a cool tool.
So everyone wanted to do A/B testing: what worked for Obama will surely work for me too, and so on. The big problem is that most shops or websites (with me it's usually about shops) are simply not testable. Why? Not enough traffic, because a test is really just an ordinary double-blind study like the ones we know from medicine or from science in general. Statistically significant results: for me to get enough results in such a study, statistically significant results, I need enough data, and that data always comes down to visitors on the site on one side and conversions on the other; those are the two main factors at play. And if I have too few visitors on the site, or simply too low a conversion rate at the moment, and usually it's both, then I never get statistically significant results. I always have some data, but when I actually do the math, it's essentially random data. The testing tools display this as confidence or significance. And if that figure doesn't get above 60 or 70 percent (well, 50 percent is a coin flip, and 60 or 70 percent is not much better), and when you think it through more carefully, you realize that you need a lot of data to get truly reliable results, and over a certain period too, because you have to test for at least 7, probably even 14 days, so that every weekday is covered at least once, ideally at least twice. Optimal test duration for A/B tests: on the other hand, you must not test for too long, so as not to pull in too many external influences, too much noise, which is why roughly 2 to 6 weeks is the optimal test period. And yes, if you then don't have enough conversions and enough visit...
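The confidence and significance figures Dennis describes come from standard hypothesis testing. A minimal sketch of how a testing tool might compute them, written here as a two-proportion z-test in Python (an illustrative example, not from the podcast or any particular tool's API):

```python
from math import sqrt
from statistics import NormalDist

def ab_test_significance(visitors_a, conversions_a, visitors_b, conversions_b):
    """Two-proportion z-test comparing conversion rates of variants A and B.

    Returns the z-score and the two-sided confidence level."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled rate under the null hypothesis that both variants convert equally
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Confidence that the observed difference is not just chance
    confidence = 1 - 2 * (1 - NormalDist().cdf(abs(z)))
    return z, confidence

# 10,000 visitors per variant, 3.0% vs 3.6% conversion rate
z, conf = ab_test_significance(10_000, 300, 10_000, 360)
print(f"z = {z:.2f}, confidence = {conf:.1%}")
```

With the same 0.6-point lift but only 1,000 visitors per variant, the confidence would land far below the 95% mark, which is exactly the "not testable" problem of low-traffic shops described above.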
In this week's episode of Analytics Neat we review six of the new features announced at Opticon 2018. What do I like about Optimizely as an alternative to Adobe Target? Which features will add the most value to customers? What is the Digital Experience Stack? All of this and more in this week's episode of Analytics Neat. Thanks for listening! iTunes: https://itunes.apple.com/us/podcast/analytics-neat/id1350608276?mt=2 Google Play: https://play.google.com/music/m/Iaeur7hjizv7s654nbcsfgtxsmq?t=Analytics_Neat Continue the conversation on Twitter (https://twitter.com/BillBruno) with #AnalyticsNeat Visit BillBruno.com
One on One with Kevin Lindsay of Adobe Target by DMN One-on-One
Kevin Lindsay, head of product marketing for Adobe Target and personalization efforts, talks about bringing personalization and experiences into business. After meeting Kevin at the Adobe Summit here in Las Vegas, I invited him onto the show to learn more about the announcements over the last few days: “[We] provide the ability, through all our marketing solutions, for marketers to be able to market to their consumers as people rather than as separate devices. Typically this is viewed as a cross-device problem. How do I take this group of devices and treat them as the person they actually represent?” (Kevin Lindsay, Adobe Summit)
Today I'm joined by a man who envisions a world where every business tests their ideas. He formed his own conversion optimization agency, ‘Dayley Conversion’, in 2014, which later became part of ‘Disruptive Advertising’. Welcome to DMR, Chris Dayley. On this episode of Digital Marketing Radio we discuss split-testing, with topics including: Should every business be split-testing the performance of their web pages? From the businesses that you talk to, what percentage would you say are carrying out split tests on a regular basis? What’s wrong with just launching a new version of a page and comparing the results with previous conversion rates? What if you’re a busy marketer with little dev resource? Should you still be doing split testing? What kind of tests are really good to get started with? What would you say to someone who said to you that they’ve tried split testing before, and they didn’t notice any difference? What changes that you’ve made in the past have resulted in big improvements? Is it possible to make mistakes with split testing and actually make your conversion rate worse, and if so, what kind of mistakes have you seen? [Tweet ""Do #ABtesting on your site. If you've tried it in the past, try it again." @Chrisdayley"] Software I couldn't live without What software do you currently use in your business that if someone took away from you, it would significantly impact your marketing success? Hotjar [Website visitor testing] Optimizely [Website visitor testing] Visual Website Optimizer [Website visitor testing] Adobe Target [Website visitor testing] What software don't you use, but you've heard good things about, and you've intended to try at some point in the near future? Adobe Recommendations [Tweak automated product recommendations] My number 1 takeaway What's the single most important step from our discussion that our listeners need to take away and implement in their businesses? Do A/B testing on your site. If you've tried it in the past, try it again.
Think bigger. Think more dramatic. And if you need to, get some outside help to get additional ideas. But if you're not testing on your website, then you are missing half of the digital marketing pie. You're just sending traffic to a site that you assume is working.
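A prerequisite for the split-testing Chris describes is assigning each visitor to a variant randomly but consistently. A minimal sketch of deterministic hash-based bucketing, the common way tools keep assignments sticky (the function name and 50/50 split here are illustrative assumptions, not any specific tool's API):

```python
import hashlib

def assign_variant(user_id: str, experiment: str, split: float = 0.5) -> str:
    """Deterministically bucket a visitor into variant 'A' or 'B'.

    Hashing the experiment name together with the user ID keeps a visitor
    in the same variant on every visit, while keeping assignments across
    different experiments independent of each other."""
    digest = hashlib.md5(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # map hash to uniform [0, 1]
    return "A" if bucket < split else "B"

# The same visitor always lands in the same variant of the same experiment
assert assign_variant("visitor-42", "headline-test") == \
       assign_variant("visitor-42", "headline-test")
```

Because assignment is a pure function of the visitor ID, no per-user state needs to be stored, and over many visitors the traffic divides close to the requested split.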
Having the right piece of content at the right time, in the right format, delivered in the right channel goes a long way towards creating great customer experiences today. Which is why it has never been more important, from both a lead generation and a conversion perspective, to optimize your content to create personalized, relevant opportunities to engage today’s customers. Drew Burns, Principal Product Marketing Manager for Adobe Target, a tool for testing and targeting digital experiences, shares with us why mobile devices are driving the need to use location, device type, and profile data to connect content to customers and prospects at “the moment of truth”.
Brent Dykes is the Evangelist for Analytics at Adobe and is responsible for guiding and evangelizing the vision of Adobe’s analytics solutions. He is also the author of two books: Web Analytics Action Hero and Web Analytics Kick Start Guide (Limited time, download this book for free here). Today on Digital Marketing Radio we discuss the following: What does it take to become a web analytics action hero? How do you prioritise your analytics efforts? And how do you improve the efficiency and effectiveness of your online analysis? What's more important when employing someone to work in analytics - maths or marketing? Should everyone in the marketing team be involved with analytics? Is it best to employ a specialist agency for analytics, or someone to work directly for you? What are the most important areas to be looking at initially from a priority perspective? Understanding what we want our customers to do The importance of building a conversion funnel Can you get your visitors to come back to your site and convert again? What Brent includes in his 'Web Analytics Action Hero' book and why How to become a data driven marketing team Software I couldn't live without What software do you currently use in your business that if someone took away from you, it would significantly impact your marketing success? Adobe Analytics [Website analytics tool] What software don't you use, but you've heard good things about, and you've intended to try at some point in the near future? Adobe Target [Testing / targeting software] My number 1 takeaway What's the single most important step from our discussion that our listeners need to take away and implement in their businesses? You need to trust the data. Understand what you're trying to do online, and then get the right metrics in place to evaluate those business goals that you have.