Udemy: Digital Advertising & Marketing 101: Take The Complete Guide / Digital Advertising & Marketing 201: Today's Trends & Topics
AdThrive Basic Ad Training: How AdThrive Ads Work
This is going to be the bread and butter.
Guaranteed delivery. In some systems, especially in IoT scenarios, it's crucial to guarantee that events are delivered.
Hm, how does Apple do this? They definitely have holes.
Real-time processing with minimum time lag.
aka low latency
Pub/sub: The messaging infrastructure keeps track of subscriptions. When an event is published, it sends the event to each subscriber. After an event is received, it cannot be replayed, and new subscribers do not see the event.
GraphQL uses this setup, which is interesting. Never classified it as event driven architecture. I guess this is kind of like redux sagas as well.
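Rough sketch of the pub/sub semantics described above: subscriptions are tracked, each published event fans out to the current subscribers, and late subscribers never see past events. All names here are made up.

```typescript
// Minimal pub/sub broker: subscriptions are tracked, each published event is
// pushed to current subscribers, and the event is not stored for replay.
type Handler<E> = (event: E) => void;

class Broker<E> {
  private subscribers = new Set<Handler<E>>();

  subscribe(handler: Handler<E>): () => void {
    this.subscribers.add(handler);
    return () => this.subscribers.delete(handler); // unsubscribe
  }

  publish(event: E): void {
    // Fan out to everyone currently subscribed; nothing is kept afterwards,
    // so the event cannot be replayed later.
    for (const handler of this.subscribers) handler(event);
  }
}

const broker = new Broker<{ type: string; payload: unknown }>();
broker.subscribe((e) => console.log("consumer A saw", e.type));
broker.publish({ type: "device.reading", payload: { temp: 21 } });
// A subscriber added after this point would never receive the event above.
```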
Events are delivered in near real time, so consumers can respond immediately to events as they occur
Realized the other day that Apple has a huge event-driven architecture with all its cross-device syncing.
An event-driven architecture consists of event producers that generate a stream of events, and event consumers that listen for the events.
Where is our event ingestion?
I know that we have listeners for certain events via our emitter class that handles consuming the events, but I'm less sure what handles the middle.
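To make the "middle" concrete, here's a minimal sketch (not our actual emitter class) of where ingestion sits between producers and consumers, using Node's EventEmitter as a stand-in:

```typescript
import { EventEmitter } from "node:events";

// The "middle" is whatever accepts events from producers and routes them to
// listeners. Here a plain EventEmitter plays that role; in a larger system it
// could be a queue or a managed event hub / message broker.
const ingestion = new EventEmitter();

// Consumer: listens for a named event, analogous to a listener registered on
// an emitter class.
ingestion.on("user.signed_up", (payload: { userId: string }) => {
  console.log("send welcome email to", payload.userId);
});

// Producer: generates the event stream by emitting into the ingestion point.
function signUpUser(userId: string) {
  // ... persist the user, then publish the fact that it happened
  ingestion.emit("user.signed_up", { userId });
}

signUpUser("u_123");
```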
Posts with videos earn more backlinks and see a boost in search traffic – our testing shows search traffic increases by 40% when publishers add a related video to a post.
Makes me want to start adding video to my Medium posts
For every additional 30 seconds a reader spends on the page, you can generally expect a 5% revenue increase.
Incredibly impressive here.
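Quick back-of-the-envelope on that claim. It isn't stated whether the 5% is linear or compounding per extra 30 seconds, so both are shown below, with a made-up baseline RPM:

```typescript
// Rough model of the "5% per extra 30 seconds" claim. Both interpretations
// are shown; baselineRpm and the time values are made-up numbers.
function upliftLinear(baselineRpm: number, extraSeconds: number): number {
  return baselineRpm * (1 + 0.05 * (extraSeconds / 30));
}

function upliftCompounding(baselineRpm: number, extraSeconds: number): number {
  return baselineRpm * Math.pow(1.05, extraSeconds / 30);
}

console.log(upliftLinear(20, 90));      // 20 * 1.15 = 23.0
console.log(upliftCompounding(20, 90)); // 20 * 1.05^3 ≈ 23.15
```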
For maximum video revenue, run both types of players.
Interesting. So playlist is like a safe bet while related is asymmetric upside
By default, the Playlist video player sticks to the top of the screen on mobile once a reader scrolls past the player.
how recent is this change?
The Playlist video player is the top-earning video player we offer, accounting for 90% of a site’s video revenue on average, and boosting RPM by 10-20%.
Incredibly impressive stuff. What does it boost RPM in comparison to?
When you run our video players, we provide fast (and free) video hosting that plays up to 30 seconds of non-skippable* ads before each video.
How expensive is this? Why don't we have WP host it?
The Related player is a video you manually embed into a specific post or page on your site by adding an HTML snippet or WordPress shortcode into the page editor.
Why do you need to do this? Can we make this a block on WordPress by any chance?
Engage readers and improve SEO.
how does this improve SEO?
Plays a playlist of your videos automatically on most pages on your site unless you turn it off for a specific page.
this is like a carousel for your content with ads interlaced?
Related player: Plays videos you've embedded on a post or page.
Aka their content (with an ad in front??)
The Sticky Outstream Video Player can run on any page on your site. On pages where you are running another video player, the Sticky Outstream player may serve before the reader scrolls down to a Sticky Playlist or Sticky Related player, or until they interact with a Stationary Related player.
Will both play at the same time?
Unlike the Related or Playlist video players, it does not initially appear within your content and then move to the side of the screen. This video player loads as a small video ad unit at the bottom right of the screen. Readers can close the video player if they choose.
Is this the one Andy upgraded to stick to the top of the screen?
An Outstream ad earns 10-30% more than a static display ad for the same ad slot. You’ll see Outstream earnings included with your other content ad earnings in the AdThrive dashboard.
Ok, so this is more of a normal range? So 2x is huge? What is the ratio of a static content ad's earnings to a display ad's?
An Outstream ad only serves when it will earn you more than a regular display ad and when a reader is on a fast connection, and it waits to load until the content on your page has finished loading.
What's a normal display ad? How do you figure out that it's a fast connection?
An expandable sticky footer ad only serves when it can pay you more than a regular sticky footer ad.
How exactly do you figure this out?
Pageviews where these campaigns run earn significantly more than normal, anywhere from 2–4x higher RPM!
How much more is this than normal? What is the distribution of RPM? Is it roughly normal, or are there asymmetric aspects to it?
Marmalade scans for text like “sponsored by” or “brought to you by” to remove those posts from the pool that can be selected for this type of ad campaign. One caveat: those phrases need to be in plain text on the site, so it can’t recognize an image that indicates that a post is sponsored.
Interesting. Would be fun to start playing around with image-to-text here, though that's a lot of processing. And I guess it kind of pushes us towards Google territory and crawling.
No, it doesn’t add any new ad spaces to your site. Instead of running ads for a mix of companies in all the existing ad spaces on that post, only ads for the “sponsoring” company will run on this single post for the duration of the ad campaign.
Must this be a manual direct partnership? How does Google / other ad servers handle this if it's programmatic?
Auto-Sponsored Post ads use our proprietary artificial intelligence platform, Marmalade, to help brands find posts you’ve already created that are relevant to ad campaigns they’re running. This allows that brand to “sponsor” a post on your site by filling in all existing ad spaces on that post with their ad and displaying a sponsorship message at the top of the post.
I think this might be the most interesting type of ad I've come across yet. Marmalade is fascinating.
In accordance with Federal Trade Commission (FTC) regulations, Native ads are always clearly marked as paid content, with wording saying the ad content is “sponsored by” a particular advertiser.
Why and when did the FTC get involved here? This is one of the more regulated industries I've seen, and I came from gambling.
Native ads complement your content and site design. Typical display ads aim to grab the reader’s attention and interest, while this new display type intends to function seamlessly with your content, whether that’s inside your post content on desktop or mobile, or in one of your sidebar ad slots.
How do they figure out what is "native"? Are they checking the CSS?
When they run on the page, Full Interscroller ads earn 200-400% more than an average mobile display ad. You’ll see Interscroller ad earnings included with your other mobile content ad earnings in the AdThrive dashboard.
How hard is it to separate the revenue from different ad types? I actually like these ads a surprising amount.
We’ve seen interstitial ads pay double-digit CPMs and increase RPM by up to 2% (and sometimes much more) for publishers who opt in.
It's pretty interesting how subtly different the revenue and the levers are here. Surprised this meets CBA guidelines, as it seems disruptive.
Interstitial ads are full-page ads that sometimes serve when your reader navigates to a second page on your site. Once the reader closes the ad, they will continue to the destination page.
Not sure I've ever encountered one of these in the wild. I can see why they are high revenue. Is this just a redirect with a follow link thrown in?
Why the name?
Resolution is the process of exchanging a token for an instance. Our container will recursively fulfill the dependencies of the token being resolved in order to return a fully constructed object.
Is this anything like Providers in React / Apollo?
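This reads like tsyringe's API, so here's a minimal sketch of resolution under that assumption: the container builds the Database dependency recursively when UserRepository is resolved. Class names are made up.

```typescript
import "reflect-metadata";
import { container, injectable } from "tsyringe";

@injectable()
class Database {
  query(sql: string) { /* ... */ }
}

@injectable()
class UserRepository {
  // The container sees this constructor parameter and resolves Database first.
  constructor(public db: Database) {}
}

// Resolution: exchange the UserRepository token for a fully constructed
// instance; the Database dependency is built recursively along the way.
const repo = container.resolve(UserRepository);
```

It's loosely similar to React/Apollo Providers in spirit (a value supplied from outside rather than constructed inline), but the lookup happens via constructor metadata rather than component context.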
The normal way to achieve this is to add DependencyContainer.register() statements somewhere in your program some time before your first decorated class is instantiated.
So do we just have a bunch of register statements in here somewhere?
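Presumably something like this: a bootstrap file that registers tokens once at startup, before anything gets resolved. The Logger/Config tokens below are just illustrative, not our actual registrations.

```typescript
import "reflect-metadata";
import { container } from "tsyringe";

// Hypothetical token/implementation pair, just for illustration.
interface Logger {
  log(msg: string): void;
}

class ConsoleLogger implements Logger {
  log(msg: string) {
    console.log(msg);
  }
}

// Typically this lives in a bootstrap/index file and runs once at startup,
// before the first decorated class is instantiated.
container.register<Logger>("Logger", { useClass: ConsoleLogger });
container.register("Config", { useValue: { apiUrl: "https://example.com" } });
```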
The general principle behind Inversion of Control (IoC) containers is you give the container a token, and in exchange you get an instance/value. Our container automatically figures out the tokens most of the time, with 2 major exceptions, interfaces and non-class types, which require the @inject() decorator to be used on the constructor parameter to be injected (see above).
So this is kind of like a HashMap, with tokens as the keys and classes (plus their dependencies) as the values.
Singleton: Each resolve will return the same instance (including resolves from child containers).
Ah ok, so this is going to be "global" in the sense that once it's initialized, it's done and can be reused.
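A quick sketch of the singleton lifecycle (assuming tsyringe): both resolves, even the one from a child container, hand back the same instance. Cache is a made-up class.

```typescript
import "reflect-metadata";
import { container, singleton } from "tsyringe";

@singleton()
class Cache {
  store = new Map<string, unknown>();
}

// Roughly equivalent without the decorator:
// container.register(Cache, { useClass: Cache }, { lifecycle: Lifecycle.Singleton });

const a = container.resolve(Cache);
const b = container.createChildContainer().resolve(Cache);
console.log(a === b); // true: one instance, shared even across child containers
```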
Parameter decorator which allows for a transformer object to take an action on the resolved object before returning the result.
What is a transformer object?
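If this is tsyringe's @injectWithTransform, a transformer object is just a class with a transform() method: it receives the resolved instance plus any extra args, and whatever it returns is what actually gets injected. Illustrative sketch with made-up names:

```typescript
import "reflect-metadata";
import { container, injectable, injectWithTransform } from "tsyringe";

class FeatureFlags {
  isOn(flag: string) {
    return flag === "newPlayer";
  }
}

// The "transformer object": it gets the resolved FeatureFlags plus the extra
// args passed to the decorator, and its return value is what ends up injected.
class FlagTransformer {
  transform(flags: FeatureFlags, flag: string): boolean {
    return flags.isOn(flag);
  }
}

@injectable()
class VideoPlayer {
  constructor(
    @injectWithTransform(FeatureFlags, FlagTransformer, "newPlayer")
    public newPlayerEnabled: boolean
  ) {}
}

const player = container.resolve(VideoPlayer);
console.log(player.newPlayerEnabled); // true
```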
Parameter decorator factory that allows for interface and other non-class information to be stored in the constructor's metadata.
In what situation would you need this?
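You'd need it for interfaces (or other string/symbol tokens), since those don't exist at runtime for the container to reflect on. A hedged sketch with made-up names:

```typescript
import "reflect-metadata";
import { container, injectable, inject } from "tsyringe";

// An interface has no runtime representation, so the container can't infer it
// from the constructor; @inject("Mailer") stores an explicit token instead.
interface Mailer {
  send(to: string, body: string): void;
}

class SmtpMailer implements Mailer {
  send(to: string, body: string) {
    console.log(`sending to ${to}: ${body}`);
  }
}

container.register<Mailer>("Mailer", { useClass: SmtpMailer });

@injectable()
class SignupService {
  constructor(@inject("Mailer") private mailer: Mailer) {}

  register(email: string) {
    this.mailer.send(email, "Welcome!");
  }
}

container.resolve(SignupService).register("hi@example.com");
```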
@singleton()
What exactly do they mean by singleton? I've seen this in a couple of places.
@injectable()
So this is the decorator, what exactly does it do?
Does it just allow you to pull this in with all the dependencies in place?
At some point soon, I believe, we'll start seeing DALL-E buttons show up in the interfaces of apps like Twitter, in the same way we see IMAGE, GIF, and EMOJI buttons there now.
Wild. Wonder how much money something like OpenAI has raised.
There's texting, emailing, chatting, video-conferencing, streaming, posting to message boards, updating statuses on Facebook, Linkedin, Twitter, and many other social media platforms.
Would be fun to use DALL-E to generate novel emojis and GIFs.
There are unlimited utilitarian use cases out there where the need for imagery is not so strong that a budget or the time costs of working with a professional artist or designer are warranted
Excellent way of putting it; I've struggled to articulate this. I'd argue this is part of the reason there will be less cannibalization of designers. The price point is so low for some of these projects that it's effectively zero at the moment, so it's below the range of designers.
In such instances, you can use DALL-E to quickly "manufacture" components and parts for your scene
I'm most impressed by the workflow implications of these. Reminds me that these are tools and considering them as such is powerful. Just as focusing on one writing tool is limiting, so is the case with visual tools. The aim is workflows, not tools.
Thanks to DALL-E's reliance on language, anyone with existing expertise in visual concepts, styles, techniques is already well on their way to DALL-E fluency.
Interesting note here, that DALL-E is not just a triumph of computer vision, but of NLP as well.
Finally, you can use DALL-E's editing brush to erase specific parts of an image, and DALL-E will generate four new images where only the parts you've erased change.
Had no idea this was part of the deal - haven't seen a good review like this I suppose.
I'll just say that DALL-E shows how racing against the machines is hardly our only option. We can dance with them too, using AI collaboratively and synergistically, in ways that radically amplify and extend our human skills and capabilities.
Love this framing, though I think it's naive given the centralization of power that these tools are a product of. See The Age of Surveillance Capitalism.
A writer is using it to create visualizations of characters and locations, which he then uses to create more detailed descriptions in his book.
Love this - hadn't thought of the fiction implications. Might be fun to give people prompts via DALL-E daily on Twitter or via a newsletter.
Or they employ some novel production technique to achieve a clever end, like using DALL-E to create images for use in a stop-motion video.
Or video game prototyping, a recent innovation I saw.
What's so compelling about DALL-E and its peers is how they literally illustrate AI's progress, power, and future potential in a hands-on, highly visible way.
That's a fascinating point. Interesting in that it's subtly different than solving the opaqueness problem. These are more visibly using AI but they aren't more transparent about how their tech works. That said, Midjourney is doing a decent job with this.
"Press the button – we do the rest," an early Kodak ad exclaimed. By 1900, the Kodak Brownie could be had for $1, a roll of film with six exposures cost a dime, and photography had shifted from a narrow domain of skilled professionals to a much broader one of amateurs spontaneously documenting the world as they saw it.
Well photography pushes on my "this is fundamentally different" point. Forgot how big of a jump each successive innovation in that space was.
but fewer than 20 surviving paintings are, in the words of the Encyclopedia Britannica, "definitively attributed to him."
Shocking just how few this is, especially after reading thousands of words about them in Walter Isaacson's book.
Visual expression can't exist without technology. Great artists have always been great innovators. If groundbreaking artists like Da Vinci, Pablo Picasso, Georgia O'Keefe, and Frida Kahlo were alive today, I'm sure they'd be experimenting with DALL-E.
Not sure I 100% agree with the scale and scope of this. That said, there have been plenty of great artists experimenting with the metaverse, so maybe I'm off here. Still, it feels like he's an order of magnitude off; DALL-E feels fundamentally different than previous tech.
This is an interesting article - I found this cool tool called hypothes.is that lets us annotate articles and share our annotations...