Not-for-profit ad group GARM shuts down after X lawsuit

This comes just days after Musk said in a post that the company ‘tried being nice’ for two years and got ‘nothing but empty words’, adding ‘now, it is war’.


ASML’s latest machine powers new breakthroughs for logic and memory chips

Imec, a leading semiconductor research company based in Belgium, today announced a series of chipmaking breakthroughs at its joint lab with ASML. The lab opened its doors in June with the aim of providing ecosystem partners with early access to the High NA EUV prototype scanner. The High NA machine represents the latest advancement in extreme ultraviolet (EUV) lithography systems, which use light to draw chip patterns on silicon wafers. It’s ASML’s most advanced tool to date. Now, imec says that the use of the technology has already yielded impressive results. The first is the successful printing of circuit patterns… This story continues at The Next Web.

Understanding & Optimizing Cumulative Layout Shift (CLS)

Cumulative Layout Shift (CLS) is a Google Core Web Vitals metric that measures a user experience event.

CLS became a ranking factor in 2021, which makes it important to understand what it is and how to optimize for it.

What Is Cumulative Layout Shift?

CLS is the unexpected shifting of webpage elements while a user is scrolling or interacting with the page.

The kinds of elements that tend to cause shift are fonts, images, videos, contact forms, buttons, and other kinds of content.

Minimizing CLS is important because pages that shift around can cause a poor user experience.

A poor CLS score (above 0.1, the upper bound of Google’s “good” range) is indicative of coding issues that can be solved.

What Causes CLS Issues?

There are five main reasons why Cumulative Layout Shift happens:

Images without dimensions.
Ads, embeds, and iframes without dimensions.
Dynamically injected content.
Web Fonts causing FOIT/FOUT.
CSS or JavaScript animations.

Images and videos must have the height and width dimensions declared in the HTML. For responsive images, make sure that the different image sizes for the different viewports use the same aspect ratio.

Let’s dive into each of these factors to understand how they contribute to CLS.

Images Without Dimensions

Browsers cannot determine an image’s dimensions until they download it. As a result, upon encountering an <img> HTML tag, the browser can’t allocate space for the image.

Once the image is downloaded, the browser needs to recalculate the layout and allocate space for the image to fit, which causes other elements on the page to shift.

By providing width and height attributes in the <img> tag, you inform the browser of the image’s aspect ratio. This allows the browser to allocate the correct amount of space in the layout before the image is fully downloaded and prevents any unexpected layout shifts.
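For example, here is a minimal sketch (the file name and dimensions are hypothetical):

<img src="burger-hero.jpg" width="1200" height="675" alt="A burger on a plate">

Paired with a CSS rule like the one below, the image stays responsive while the browser derives the aspect ratio from the width and height attributes above:

img {
max-width: 100%;
height: auto;
}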

Ads Can Cause CLS

If you load AdSense ads in the content or leaderboard on top of the articles without proper styling and settings, the layout may shift.

This one is a little tricky to deal with because ad sizes can be different. For example, it may be a 970×250 or 970×90 ad, and if you allocate 970×90 space, it may load a 970×250 ad and cause a shift.

In contrast, if you allocate a 970×250 ad and it loads a 970×90 banner, there will be a lot of white space around it, making the page look bad.

It is a trade-off: either load ads of a single fixed size and protect your CLS score, or load multiple ad sizes and benefit from increased inventory and higher CPMs at the expense of user experience and your CLS metric.
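One common mitigation, sketched below with a hypothetical class name, is to reserve the tallest expected size with min-height and accept some white space when the smaller banner loads:

/* Reserve space for the tallest expected creative (970x250 here),
   so a late-loading ad cannot push the content below it */
.leaderboard-ad {
width: 970px;
min-height: 250px;
}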

Dynamically Injected Content

This is content that is injected into the webpage.

For example, posts on X (formerly Twitter), which load in the content of an article, may have arbitrary height depending on the post content length, causing the layout to shift.

Of course, those are usually below the fold and don’t count on the initial page load, but if the user scrolls to the point where the X post is placed before it has loaded, it will cause a layout shift and contribute to your CLS metric.

One way to mitigate this shift is to set an average min-height CSS property on the parent div of the embed. It is impossible to know the height of the post before it loads, so pre-allocating an average amount of space reduces the shift.
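A minimal sketch of that approach (the 500px value is an assumed average embed height, not a measured one):

#tweet-div {
min-height: 500px; /* assumed average embed height; tune to your content */
}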

Another way to fix this is to apply a CSS rule to the parent div tag containing the tweet that caps its height:

#tweet-div {
max-height: 300px;
overflow: auto;
}

However, it will cause a scrollbar to appear, and users will have to scroll to view the tweet, which may not be best for user experience.

If none of the suggested methods works, you could take a screenshot of the tweet and link to it.

Web-Based Fonts

Downloaded web fonts can cause what’s known as a Flash of Invisible Text (FOIT).

A way to prevent that is to preload the fonts

<link rel="preload" href="https://www.example.com/fonts/inter.woff2" as="font" type="font/woff2" crossorigin>

and to use the font-display: swap; CSS property in the @font-face at-rule:

@font-face {
font-family: Inter;
font-style: normal;
font-weight: 200 900;
font-display: swap;
src: url('https://www.example.com/fonts/inter.woff2') format('woff2');
}

With these rules, you are loading web fonts as quickly as possible and telling the browser to use the system font until it loads the web fonts. As soon as the browser finishes loading the fonts, it swaps the system fonts with the loaded web fonts.

However, you may still have an effect called Flash of Unstyled Text (FOUT), which is impossible to avoid when using non-system fonts because it takes some time until web fonts load, and system fonts will be displayed during that time.

When the web font finishes loading, the title font swaps in, causing a visible shift.

The visibility of FOUT depends on the user’s connection speed if the recommended font loading mechanism is implemented.

If the user’s connection is sufficiently fast, the web fonts may load quickly enough and eliminate the noticeable FOUT effect.

Therefore, using system fonts whenever possible is a great approach, but it may not always be possible due to brand style guidelines or specific design requirements.

CSS Or JavaScript Animations

Animating an HTML element’s height via CSS or JavaScript, for example expanding or shrinking it vertically, pushes the content below it around and causes a layout shift.

To prevent that, animate with CSS transforms instead. A transform moves or scales the element visually without changing the space allocated to it in the layout, so the surrounding content stays put.
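A small sketch of the difference (the class names are hypothetical):

/* Causes a shift: animating height reflows all the content below */
.panel-shift {
height: 0;
transition: height 300ms ease;
}

/* No shift: transform is purely visual and leaves the layout box in place */
.panel-transform {
transform: scaleY(0);
transform-origin: top;
transition: transform 300ms ease;
}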

How Cumulative Layout Shift Is Calculated

Cumulative Layout Shift is the product of two measurements, called “impact fraction” and “distance fraction”:

CLS = Impact Fraction × Distance Fraction

Impact Fraction

Impact fraction measures how much space an unstable element takes up in the viewport.

The viewport is the portion of the page visible on the screen.

When an element downloads and then shifts, the impact fraction captures the total space the element affects: the union of the area it occupied in the viewport when first rendered and the area it occupies in its final position.

The example that Google uses is an element that occupies 50% of the viewport and then drops down by another 25%.

When added together, the 75% value is called the Impact Fraction, and it’s expressed as a score of 0.75.

Distance Fraction

The second measurement is called the Distance Fraction. The distance fraction is the amount of space the page element has moved from the original to the final position.

In the above example, the page element moved 25%.

So now the Cumulative Layout Score is calculated by multiplying the Impact Fraction by the Distance Fraction:

0.75 x 0.25 = 0.1875
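For context, a score of 0.1875 sits above Google’s “good” threshold of 0.1, in the “needs improvement” range (scores above 0.25 are considered poor).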

The calculation involves some more math and other considerations. What’s important to take away from this is that the score is one way to measure an important user experience factor.


Understand Cumulative Layout Shift

Understanding Cumulative Layout Shift is important, but it’s not necessary to know how to do the calculations yourself.

However, understanding what it means and how it works is key, as this has become part of the Core Web Vitals ranking factor.


How Our Website Conversion Strategy Increased Business Inquiries by 37%

Having a website that doesn’t convert is a little like having a bucket with a hole in it. Do you keep filling it up while the water’s pouring out — or do you fix the hole then add water? In other words, do you channel your budget into attracting people who are “pouring” through without taking action, or do you fine-tune your website so it’s appealing enough for them to stick around?

Our recommendation? Optimize the conversion rate of your website, before you spend on increasing your traffic to it.

Here’s a web design statistic to bear in mind: you have 50 milliseconds to make a good first impression. If your site’s too slow, or unattractive, or the wording isn’t clear, visitors will bounce faster than you can say “leaky bucket”. Which is a shame, because you’ve put lots of effort into designing a beautiful product page and About Us page, and people just aren’t getting to see it.

As a digital web design and conversion agency in Melbourne, Australia, we’ve been helping our customers optimize their websites for over 10 years, but it wasn’t until mid-2019 that we decided to turn the tables and take a look at our own site.

As it turned out, we had a bit of a leaky bucket situation of our own: while our traffic was good and conversions were okay, there was definitely room for improvement.

In this article, I’m going to talk a little more about conversions: what they are, why they matter, and how they help your business. I’ll then share how I made lots of little tweaks that cumulatively led to my business attracting a higher tier of customers, more inquiries, plus over $780,000 worth of new sales opportunities within the first 26 weeks of making some of those changes. Let’s get into it!

What is conversion?

Your conversion rate is a figure that represents the percentage of visitors who come to your site and take the desired action, e.g. subscribing to your newsletter, booking a demo, purchasing a product, and so on.
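For example (with hypothetical numbers): if 2,000 people visit your site in a month and 50 of them submit an inquiry form, your conversion rate for that goal is 50 ÷ 2,000 = 2.5%.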

Conversions come in all shapes and sizes, depending on what your website does. If you sell a product, making a sale would be your primary goal (aka a macro-conversion). If you run, say, a tour company or media outlet, then subscribing or booking a consultation might be your primary goal.

If your visitor isn’t quite ready to make a purchase or book a consultation, they might take an intermediary step — like signing up to your free newsletter, or following you on social media. This is what’s known as a micro-conversion: a little step that leads towards (hopefully) a bigger one.

A quick recap

A conversion can apply to any number of actions — from making a purchase, to following on social media.

Macro-conversions are those we usually associate with sales: a phone call, an email, or a trip to the checkout. These happen when the customer has done their research and is ready to leap in with a purchase. If you picture the classic conversion funnel, they’re already at the bottom.

Conversion funnel showing paying clients at the bottom.

Micro-conversions, on the other hand, are small steps that lead toward a sale. They’re not the ultimate win, but they’re a step in the right direction.

Most sites and apps have multiple conversion goals, each with its own conversion rate.

Micro-conversions vs. macro-conversions: which is better?

The short answer? Both. Ideally, you want micro- and macro-conversions to be happening all the time so you have a continual flow of customers working their way through your sales funnel. If you have neither, then your website is behaving like a leaky bucket.

Here are two common issues that seem like good things, but ultimately lead to problems:

High web traffic (good thing) but no micro- or macro-conversions (bad thing — leaky bucket alert)

High web traffic (good thing) and plenty of micro-conversions (good thing), but no macro-conversions (bad thing)

A lot of businesses spend heaps of money making sure their employees work efficiently, but less of the budget goes into what is actually one of their best marketing tools: the website.

Spending money on marketing will always be a good thing. Getting customers to your site means more eyes on your business — but when your website doesn’t convert visitors into sales, that’s when you’re wasting your marketing dollars. When it comes to conversion rate statistics, one of the biggest eye-openers I read was this: the average user’s attention span has dropped from 12 to a mere 7 seconds. That’s how long you’ve got to impress before they bail — so you’d better make sure your website is fast, clear, and attractive.

Our problem

Our phone wasn’t ringing as much as we’d have liked, despite spending plenty of dollars on SEO and Adwords. We looked into our analytics and realized traffic wasn’t an issue: a decent number of people were visiting our site, but too few were taking action — i.e. inquiring. Here’s where some of our issues lay:

Our site wasn’t as fast as it could have been (anything with a load time of two seconds or over is considered slow. Ours was hovering around 5-6, and that was having a negative impact on conversions).

Our CTA conversions were low (people weren’t clicking — or they were dropping off because the CTA wasn’t where it needed to be).

We were relying on guesswork for some of our design decisions — which meant we had no way of measuring what worked, and what didn’t.

In general, things were good but not great. Or in other words, there was room for improvement.

What we did to fix it

Improving your site’s conversions isn’t a one-size-fits all thing — which means what works for one person might not work for you. It’s a gradual journey of trying different things out and building up successes over time. We knew this having worked on hundreds of client websites over the years, so we went into our own redesign with this in mind. Here are some of the steps we took that had an impact.

We decided to improve our site

First of all, we decided to fix our company website. This sounds like an obvious one, but how many times have you thought “I’ll do this really important thing”, then never gotten round to it. Or rushed ahead in excitement, made a few tweaks yourself, then let your efforts grind to a halt because other things took precedence?

This is an all-too-common problem when you run a business and things are just… okay. Often there’s no real drive to fix things and we fall back into doing what seems more pressing: selling, talking to customers, and running the business.

Deciding you want to improve your site’s conversions starts with a decision that involves you and everyone else in the company, and that’s what we did. We got the design and analytics experts involved. We invested time and money into the project, which made it feel substantial. We even made EDMs to announce the site launch (like the one below) to let everyone know what we’d been up to. In short, we made it feel like an event.

Graphic showing hummingbird flying in front of desktop monitor with text

We got to know our users

There are many different types of user: some are ready to buy, some are just doing some window shopping. Knowing what type of person visits your site will help you create something that caters to their needs.

We looked at our analytics data and discovered visitors to our site were a bit of both, but tended to be more ready to buy than not. This meant we needed to focus on getting macro-conversions — in other words, make our site geared towards sales — while not overlooking the visitors doing some initial research. For those users, we implemented a blog as a way to improve our SEO, educate leads, and build up our reputation.

User insight can also help you shape the feel of your site. We discovered that the marketing managers we were targeting at the time were predominantly women, and that certain images and colours resonated better among that specific demographic. We didn’t go for the obvious (pictures of the team or our offices), instead relying on data and the psychology of attraction to delve into the minds of our users.

Chromatix website home page showing a bright pink flower and text.
Chromatix web page showing orange hummingbird and an orange flower.

We improved site speed

Sending visitors to good sites with bad speeds erodes trust and sends them running. Multiple studies show that site speed matters when it comes to conversion rates. It’s one of the top SEO ranking factors, and a big factor when it comes to user experience: pages that load in under a second convert around 2.5 times higher than pages taking five seconds or more.

Bar chart showing correlation between fast loading pages and a higher conversion rate.

We built our website for speed. Moz has a great guide on page speed best practices, and from that list, we did the following things:

We optimized images.

We managed our own caching.

We compressed our files.

We improved page load times (Moz has another great article about how to speed up Time to First Byte). A good web page load time is considered to be anything under two seconds — which we achieved.

In addition, we also customized our own hosting to make our site faster.

We introduced more tracking

As well as making our site faster, we introduced a lot more tracking. That allowed us to refine our content, our messaging, the structure of the site, and so on, which continually improved our conversions.

We used Google Optimize to run A/B tests across a variety of things to understand how people interacted with our site. Here are some of the tweaks we made that had a positive impact:

Social proofing can be a really effective tool if used correctly, so we added some stats to our landing page copy.

Google Analytics showed us visitors were reaching certain pages and not knowing quite where to go next, so we added CTAs that used active language. So instead of saying, “If you’d like to find out more, let us know”, we said “Get a quote”, along with two options for getting in touch.

We spent an entire month testing four words on our homepage. We actually failed (the words didn’t have a positive impact), but it allowed us to test our hypothesis. We did small tweaks and tests like this all over the site.

Analytics data showing conversion rates.

We used heat mapping to see where visitors were clicking, and which words caught their eye. With this data, we knew where to place buttons and key messaging.

We looked into user behavior

Understanding your visitor is always a good place to start, and there are two ways to go about this:

Quantitative research (numbers and data-based research)

Qualitative research (people-based research)

We did a mixture of both.

For the quantitative research, we used Google Analytics, Google Optimize, and Hotjar to get an in-depth, numbers-based look at how people were interacting with our site.

Heat-mapping software, Hotjar, shows how people click and scroll through a page. Hot spots indicate places where people naturally gravitate.

We could see where people were coming into our site (which pages they landed on first), what channel brought them there, which features they were engaging with, how long they spent on each page, and where they abandoned the site.

For the qualitative research, we focused primarily on interviews.

We asked customers what they thought about certain CTAs (whether they worked or not, and why).

We made messaging changes and asked customers and suppliers whether they made sense.

We invited a psychologist into the office and asked them what they thought about our design.

What we learned

We found out our design was good, but our CTAs weren’t quite hitting the mark. For example, one CTA only gave the reader the option to call. But, as one of our interviewees pointed out, not everyone likes using the phone — so we added an email address.

We were intentional but ad hoc about our asking process. This worked for us — but you might want to be a bit more formal about your approach (Moz has a great practical guide to conducting qualitative usability testing if you’re after a more in-depth look).

The results

Combined, these minor tweaks had a mighty impact. There’s a big difference in how our site looks and how we rank. The bottom line: after the rebuild, we got more work, and the business did much better. Here are some of the gains we’ve seen over the past two years.

Pingdom website speed test for Chromatix.

Our dwell time increased by 73%, going from 1.5 to 2.5 minutes.

We received four-times more inquiries by email and phone.

Our organic traffic increased despite us not channeling more funds into PPC ads.

Graph showing an increase in organic traffic from January 2016 to January 2020.
Graph showing changes in PPC ad spend over time.

We also realized our clients were bigger, paying on average 2.5 times more for jobs: in mid-2018, our average cost-per-job was $8,000. Now, it’s $17,000.

Our client brand names became more recognizable, household names — including two of Australia’s top universities, and a well-known manufacturing/production brand.

Within the first 26 weeks, we got over $770,000 worth of sales opportunities (if we’d accepted every job that came our way).

Our prospects began asking to work with us, rather than us having to persuade them to give us the business.

We started getting higher quality inquiries — warmer leads who had more intent to buy.

Some practical changes you can make to improve your website conversions

When it comes to website changes, it’s important to remember that what works for one person might not work for you.

We’ve used site speed boosters for our clients before and gotten really great results. At other times, we’ve tried it and it just broke the website. This is why it’s so important to measure as you go, use what works for your individual needs, and remember that “failures” are just as helpful as wins.

Below are some tips — some of which we did on our own site, others are things we’ve done for others.

Tip number 1: Get stronger hosting that allows you to consider things like CDNs. Hiring a developer should always be your top choice, but it’s not always possible to have that luxury. In this instance, we recommend considering CDNs, and depending on the build of your site, paying for tools like NitroPack which can help with caching and compression for faster site speeds.

Tip number 2: Focus your time. Identify top landing pages with Moz Pro and channel your efforts in these places as a priority. Use the 80/20 principle and put your attention on the 20% that gets you 80% of your success.

Tip number 3: Run A/B tests using Google Optimize to test various hypotheses and ideas (Moz has a really handy guide for running split tests using Google). Don’t be afraid of the results — failures can help confirm that what you are currently doing is right. You can also access some in-depth data about your site’s performance in Google Lighthouse.

Site performance data in Google Lighthouse.

Tip number 4: Trial various messages in Google Ads (as a way of testing targeted messaging). Google provides many keyword suggestions on trending words and phrases that are worth considering.

Tip number 5: Combine qualitative and quantitative research to get to know how your users interact with your site — and keep testing on an ongoing basis.

Tip number 6: Don’t get too hung up on charts going up, or figures turning orange: do what works for you. If adding a video to your homepage slows it down a little but has an overall positive effect on your conversion, then it’s worth the tradeoff.

Tip number 7: Prioritize the needs of your target customers and focus every build and design choice around them.

Recommended tools

NitroPack: speed up your site if you’ve not built it for speed from the beginning.

Google Optimize: run A/B tests.

Hotjar: see how people use your site via heat mapping and behaviour analytics.

Pingdom / GTmetrix: measure site speed (using both is better if you want to make sure you meet everyone’s requirements).

Google Analytics: find drop-off points, track conversion, A/B test, set goals.

Qualaroo: poll your visitors while they are on your site with a popup window.

Google Consumer Surveys: create a survey, Google recruits the participants and provides results and analysis.

Moz Pro: Identify top landing pages when you connect this tool to your Google Analytics profile to create custom reports.

How to keep your conversion rates high

Treat your website like your car. Regular little tweaks to keep it purring, occasional deeper inspections to make sure there are no problems lurking just out of sight. Here’s what we do:

We look at Google Analytics monthly. It helps to understand what’s working, and what’s not.

We use goal tracking in GA to keep things moving in the right direction.

We use Pingdom’s free service to monitor the availability and response time of our site.

We regularly ask people what they think about the site and its messaging (keeping the qualitative research coming in).

Conclusion

Spending money on marketing is a good thing, but when you don’t have a good conversion rate, that’s when your website’s behaving like a leaky bucket. Your website is one of your strongest sales tools, so it really does pay to make sure it’s working at peak performance.

I’ve shared a few of my favorite tools and techniques, but above all, my one bit of advice is to consider your own requirements. You can improve your site speed if you remove all tags and keep it plain. But that’s not what you want: it’s finding the balance between creativity and performance, and that will always depend on what’s important.

For us as a design agency, we need a site that’s beautiful and creative. Yes, having a moving background on our homepage slows it down a little bit, but it improves our conversions overall.

The bottom line: Consider your unique users, and make sure your website is in line with the goals of whoever you’re speaking with.

We can do all we want to please Google, but when it comes to sales and leads, it means more to have a higher-converting and more effective website. We did well in inquiries (actual phone calls and email leads) despite a rapid increase in site performance requirements from Google. This comes down to one thing: having an effective customer conversion framework for your site.

Delft startup bags €6.5M for nanotech that enables material discovery in 1 year

Delft-based VSParticle has raised €6.5mn to accelerate material discovery for industrial solutions, such as the production of green hydrogen.

Discovering new materials in the lab typically takes years, often close to 10 or even more. VSParticle aims to change this paradigm and reduce discovery time to as little as one year.

To do this, the company has tapped nanotechnology. It develops tools that can break down, synthesise, and manipulate materials at the nano-scale. Nano-scale particles of inorganic materials, also known as nanoparticles, exhibit unique properties that, when combined, can lead to the creation of novel materials.

The startup offers various tools for automated nanoparticle generation, synthesis, deposition, and prototyping which it says enable researchers and commercial R&D teams to experiment with new material creation. The integration of advanced computational tools into these processes also speeds up the identification and optimisation of novel materials. 


According to Aaike van Vugt, co-founder and CEO at VSParticle, this accelerates the transition from lab to real-world applications and mass production.

“This is what enables us to bring down the material discovery time from a decade to only one year,” van Vugt told TNW. 

The company’s flagship product, the VSP-P1 Nanoprinter, is already in use by various research teams across the world, including the Sorbonne University Abu Dhabi and the Dutch Institute for Fundamental Energy Research.

Team of scientists working with VSParticle’s nanoprinter. The VSP-P1 Nanoprinter. Credit: VSParticle

The printer can not only generate the desired nanoparticles, but also print them directly into a new product or integrate them as additional features “at the push of a button,” van Vugt said.

It comes with a user interface that enables researchers to easily modify parameters and experiment with various material configurations, making it possible to quickly test and refine different compositions.

Material discovery for energy applications

VSParticle’s technology can be used for a wide range of applications, from sensors to medical devices. It’s also particularly suitable for sustainable energy solutions.

One of them is the production of green hydrogen.

Catalyst-coated Porous Transport Layers (PTLs) are key components of electrolysers, which are essential for green hydrogen production. The technology enables the mass production of PTLs without relying on scarce materials like platinum and iridium — typically used for the catalysts.

VSParticle says its printers can help create new material combinations for PTLs. It expects 10x savings in scarce metals such as iridium, and the faster development of new products.

According to the startup, the first components using its technology will hit the market by 2027, leading to end products that can help scale green hydrogen production.

With the money, VSParticle plans to further develop its technology, aiming for next-generation printers with an up to 100 times higher output. It will also expand to Japan and double down in Europe and the US.

NordicNinja, the largest Japanese-backed VC firm in Europe, and previous investor Plural led the round. Existing backer Hermann Hauser Investment also participated. The capital injection brings the startup’s total funding amount to €24.5mn.

“Less than 1% of all possible inorganic materials have been unlocked,” van Vugt said. “The bigger goal of VSParticle is to unlock the other 99%.”

Google rolls out new features for Local Services Ads

Google unveiled new features for Local Services Ads (LSA) this week. One automatically selects profile photos to display in advertisements, while the others give advertisers more control over their spending.

Profile photos

This update aims to increase ad engagement and potentially improve ad rankings for local service providers.

How it works:

Google will choose photos from advertisers’ LSA profiles based on their likelihood to boost engagement.

Photos won’t appear in every ad, depending on user queries and other factors.

Google’s recommendations:

Upload 3-5 high-quality images to your LSA profile.

Ensure photos are relevant to your work, original, and not copied or stolen.

What to watch. How this change affects ad performance and ranking for local service providers, especially those who haven’t previously focused on visual content in their profiles.


Ad spending controls

The new ad budget features include the ability to set a maximum monthly ad spend limit for certain accounts. This gives advertisers more control over their spending, potentially preventing unexpected budget overruns. 

Key features:

Immediate effect upon setting.

Automatic campaign stoppage when limit is reached.

Monthly reset on the 1st of each month.

Flexible management – can be updated anytime.

How it works. Advertisers can toggle the account spend limit on or off, set a specific monthly limit, view last month’s spend, and monitor the current month’s spend and remaining budget through the Local Services Ads interface.


Yes, but. Due to reporting lag, there’s a possibility of exceeding the set limit, especially when:

Setting a limit for the first time.

Lowering an existing limit.

Advertisers are responsible for paying any excess charges if the limit is exceeded.

The catch. If the limit is exceeded, all ads will stop running until the next month.


UN approves cybercrime treaty amid human rights concerns

The human rights wing of the UN urged states to ensure the cybercrime treaty ‘has human rights at its heart’.


Google’s “Branded Search” Patent For Ranking Search Results

Back in 2012 Google applied for a patent called “Ranking Search Results” that shows how Google can use branded search queries as a ranking factor. The patent is about using branded search queries and navigational queries as ranking factors, plus a count of independent links. Although this patent is from 2012, it may still play a role in ranking today.

The patent was misunderstood by the search marketing community in 2012 and the knowledge contained in it was lost.

What Is The Ranking Search Results Patent About? TL/DR

The patent is explicitly about an invention for ranking search results; that’s why the patent is called “Ranking Search Results.” The patent describes an algorithm that uses two ranking factors to re-rank web pages:

Sorting Factor 1: By number of independent inbound links. This is a count of links that are independent of the site being ranked.

Sorting Factor 2: By number of branded search queries and navigational search queries. The branded and navigational search queries are called “reference queries” and are also referred to as implied links.

The counts of both factors are used to modify the rankings of the web pages.

Why The Patent Was Misunderstood TL/DR

First, I want to say that in 2012, I didn’t understand how to read patents. I was more interested in research papers and left the patent reading to others. When I say that everyone in the search marketing community misunderstood the patent, I include myself in that group.

The “Ranking Search Results” patent was published in 2012, one year after the release of a content quality update called the Panda Update. The Panda update was named after one of the engineers who worked on it, Navneet Panda. Navneet Panda came up with questions that third party quality raters used to rate web pages. Those ratings were used as a test to see if changes to the algorithm were successful at removing “content farm” content.

Navneet Panda is also a co-author of the “Ranking search results” patent. SEOs saw his name on the patent and immediately assumed that this was the Panda patent.

That assumption is wrong because the Panda update is an algorithm that uses a “classifier” to classify web pages by content quality. The “Ranking Search Results” patent is about ranking search results, period. The Ranking Search Results patent is not about content quality, nor does it feature a content quality classifier.

Nothing in the “Ranking Search Results” patent relates in any way with the Panda update.

Why This Patent Is Not The Panda Update

In 2009 Google released the Caffeine Update which enabled Google to quickly index fresh content but inadvertently created a loophole that allowed content farms to rank millions of web pages on rarely searched topics.

In an interview with Wired, former Google search engineer Matt Cutts described the content farms like this:

“It was like, “What’s the bare minimum that I can do that’s not spam?” It sort of fell between our respective groups. And then we decided, okay, we’ve got to come together and figure out how to address this.”

Google subsequently responded with the Panda Update, named after a search engineer who worked on the algorithm which was specifically designed to filter out content farm content. Google used third party site quality raters to rate websites and the feedback was used to create a new definition of content quality that was used against content farm content.

Matt Cutts described the process:

“There was an engineer who came up with a rigorous set of questions, everything from. “Do you consider this site to be authoritative? Would it be okay if this was in a magazine? Does this site have excessive ads?” Questions along those lines.

…we actually came up with a classifier to say, okay, IRS or Wikipedia or New York Times is over on this side, and the low-quality sites are over on this side. And you can really see mathematical reasons…”

In simple terms, a classifier is an algorithm within a system that categorizes data. In the context of the Panda Update, the classifier categorizes web pages by content quality.

What’s apparent when reading the “Ranking search results” patent is that it’s clearly not about content quality, it’s about ranking search results.

Meaning Of Express Links And Implied Links

The “Ranking Search Results” patent uses two kinds of links to modify ranked search results:

Implied links
Express links

Implied links: The patent uses branded search queries and navigational queries to calculate a ranking score as if the branded/navigational queries are links, calling them implied links. The implied links are used to create a factor for modifying web pages that are relevant (responsive) to search queries.

Express links: The patent also uses independent inbound links to the web page as a part of another calculation to come up with a factor for modifying web pages that are responsive to a search query.

Both of those kinds of links (implied links and independent express links) are used as factors to modify the rankings of a group of web pages.

Understanding what the patent is about is straightforward because the beginning of the patent explains it in relatively easy to understand English.

This section of the patent uses the following jargon:

A resource is a web page or website.
A target (target resource) is what is being linked to or referred to.
A “source resource” is a resource that makes a citation to the “target resource.”
The word “group” means the group of web pages that are relevant to a search query and are being ranked.

The patent talks about “express links” which are just regular links. It also describes “implied links” which are references within search queries, references to a web page (which is called a “target resource”).

I’m going to add bullet points to the original sentences so that they are easier to understand.

Okay, so this is the first important part:

“Links for the group can include express links, implied links, or both.

An express link, e.g., a hyperlink, is a link that is included in a source resource that a user can follow to navigate to a target resource.

An implied link is a reference to a target resource, e.g., a citation to the target resource, which is included in a source resource but is not an express link to the target resource. Thus, a resource in the group can be the target of an implied link without a user being able to navigate to the resource by following the implied link.”

The second important part uses the same jargon to define what implied links are:

A resource is a web page or website.
The site being linked to or referred to is called a “target resource.”
A “group of resources” means a group of web pages.

This is how the patent explains implied links:

“A query can be classified as referring to a particular resource if the query includes a term that is recognized by the system as referring to the particular resource.

For example, a term that refers to a resource may be all of or a portion of a resource identifier, e.g., the URL, for the resource.

For example, the term “example.com” may be a term that is recognized as referring to the home page of that domain, e.g., the resource whose URL is “http://www.example.com”.

Thus, search queries including the term “example.com” can be classified as referring to that home page.

As another example, if the system has data indicating that the terms “example sf” and “esf” are commonly used by users to refer to the resource whose URL is “http://www.sf.example.com,” queries that contain the terms “example sf” or “esf”, e.g., the queries “example sf news” and “esf restaurant reviews,” can be counted as reference queries for the group that includes the resource whose URL is “http://www.sf.example.com.” “

The above explanation defines “reference queries” as the terms that people use to refer to a specific website. So, for example (my example), if people include “Walmart” alongside the keyword “air conditioner” in their search query, then the query “Walmart air conditioner” is counted as a “reference query” for Walmart.com: a citation and an implied link.

The Patent Is Not About “Brand Mentions” On Web Pages

Some SEOs believe that a mention of a brand on a web page is counted by Google as if it’s a link. They have misinterpreted this patent to support the belief that an “implied link” is a brand mention on a web page.

As you can see, the patent does not describe the use of “brand mentions” on web pages. It’s crystal clear that the meaning of “implied links” within the context of this patent is about references to brands within search queries, not on a web page.

It also discusses doing the same thing with navigational queries:

“In addition or in the alternative, a query can be categorized as referring to a particular resource when the query has been determined to be a navigational query to the particular resource. From the user point of view, a navigational query is a query that is submitted in order to get to a single, particular web site or web page of a particular entity. The system can determine whether a query is navigational to a resource by accessing data that identifies queries that are classified as navigational to each of a number of resources.”

The takeaway, then, is that the patent describes the use of “reference queries” (branded/navigational search queries) as a factor similar to links, and that’s why they’re called implied links.

Modification Factor

The algorithm generates a “modification factor” which re-ranks (modifies) a group of web pages that are relevant to a search query, based on the “reference queries” (branded search queries) and on a count of independent inbound links.

This is how the modification (or ranking) is done:

A count of inbound links using only “independent” links (links that are not controlled by the site being linked to).
A count is made of the reference queries (branded search queries), which are given ranking power like a link.

Reminder: “resources” is a reference to web pages and websites.

Here is how the patent explains the part about the ranking:

“The system generates a modification factor for the group of resources from the count of independent links and the count of reference queries… For example, the modification factor can be a ratio of the number of independent links for the group to the number of reference queries for the group.”

What the patent does is filter links so that only links not associated with the website are counted, then count how many branded search queries are made for a web page or website, and use both counts as a ranking factor (the modification factor).
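As an illustrative, entirely hypothetical example of the ratio described in the quote above: if a group of resources has 400 independent inbound links and 200 reference queries, the modification factor under that formulation would be 400 / 200 = 2, and that factor would then be applied to adjust the group’s initial ranking scores.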

In retrospect it was a mistake for some in the SEO industry to use this patent as “proof” for their idea about brand mentions on websites being a ranking factor.

It’s clear that “implied links” are not about brand mentions in web pages as a ranking factor, but about brand mentions (and URLs and domains) in search queries that can be used as ranking factors.

Why This Patent Is Important

This patent describes a way to use branded search queries as a signal of popularity and relevance for ranking web pages. It’s a good signal because it’s the users themselves saying that a specific website is relevant for specific search queries. It’s a signal that’s hard to manipulate which may make it a clean non-spam signal.

We don’t know if Google uses what’s described in the patent. But it’s easy to understand why it could still be a relevant signal today.

Read The Patent Within The Entire Context

Patents use specific language, and it’s easy to misinterpret the words or overlook their meaning by focusing on specific sentences. The biggest mistake I see SEOs make is taking one or two sentences out of context and then using them to claim that Google is doing something or other. This is how SEO misinformation begins.

Read my article about How To Read Google Patents to understand how to read them and avoid misinterpreting them. Even if you don’t read patents, knowing the information is helpful because it’ll make it easier to spot misinformation about patents, which there is a lot of right now.

I limited this article to communicating what the “Ranking Search Results” patent is and what the most important points are. There are many granular details about different implementations that I don’t cover because they’re not necessary to understanding the overall patent.

If you want the granular details, I strongly encourage first reading my article about how to read patents before reading the patent.

Read the patent here:

Ranking search results

Google DeepMind’s new AI robot plays table tennis at ‘human level’

China is currently busy accumulating most of the gold medals in the table tennis events in the Paris Olympics. Meanwhile, an AI-powered robot from Google DeepMind has achieved “amateur human level performance” in the sport. 

In a study published in an arXiv paper this week, the Google artificial intelligence subsidiary outlined how the robot functions, along with footage of it taking on what we can only assume were willing and enthusiastic ping pong players of varying skill. 

According to DeepMind, the racket-wielding robot had to be good at low-level skills, like returning the ball, as well as more complex tasks, like long-term planning and strategising. It also played against opponents with diverse styles, drawing on vast amounts of data to refine and adapt its approach. 

Not Olympic level quite yet

The robotic arm — and its 3D-printed racket — won 13 out of 29 games against human opponents with different levels of skill in the game. It won 100% of matches against “beginner” and 55% against “intermediate” players. However, it lost every single time that it faced an “advanced” opponent. 


DeepMind said the results of the recent project constitute a step towards the goal of achieving human-level speed and performance on real world tasks, a “north star” for the robotics community. 

In order to achieve them, its researchers say they relied on four elements that could also make the findings useful beyond hitting a small ball over a tiny net, difficult though that may be: 

A hierarchical and modular policy architecture
Techniques to enable zero-shot sim-to-real transfer, including an iterative approach to defining the training task distribution grounded in the real world
Real-time adaptation to unseen opponents
A user study testing the model in actual matches against unseen humans in physical environments

The company further added that its approach had led to “competitive play at human level and a robot agent that humans actually enjoy playing with.” Indeed, its non-robot competitors in the demonstration videos do seem to be enjoying themselves.  

Table tennis robotics

Google DeepMind is not the only robotics company to choose table tennis to train their systems. The sport requires hand-eye coordination, strategic thinking, speed, and adaptability, among other things, making it well suited to train and test these skills in AI-powered robots. 

The world’s “first robot table tennis tutor” was acknowledged by Guinness World Records in 2017. The rather imposing machine was developed by Japanese electronics company OMRON. Its latest iteration is FORPHEUS (which stands for “Future OMRON Robotics technology for Exploring Possibility of Harmonized aUtomation with Sinic theoretics,” and is also inspired by the ancient mythological figure Orpheus). 

OMRON says it “embodies the relationship that will exist between humans and technology in the future.”

Google DeepMind makes no such existential claims for its recent ping pong champion, but the findings from its development may still prove profound for our robot friends down the line. We do however feel that DeepMind’s robotic arm is severely lacking in the abbreviation department.

Cannibalization

In today’s episode of Whiteboard Friday, Tom Capper walks you through a problem many SEOs have faced: cannibalization. What is it, how do you identify it, and how can you fix it? Watch to find out! 

Photo of the whiteboard describing cannibalization.

Video Transcription

Happy Friday, Moz fans, and today we’re going to be talking about cannibalization, which here in the UK we spell like this: cannibalisation. With that out of the way, what do we mean by cannibalization?

What is cannibalization?

So this is basically where one site has two competing URLs and performs, we suspect, less well because of it. So maybe we think the site is splitting its equity between its two different URLs, or maybe Google is getting confused about which one to show. Or maybe Google considers it a duplicate content problem or something like that. One way or another, the site does less well as a result of having two URLs. 

So I’ve got this imaginary SERP here as an example. So imagine that Moz is trying to rank for the keyword “burgers.” Just imagine that Moz has decided to take a wild tangent in its business model and we’re going to try and rank for “burgers” now.

So in position one here, we’ve got Inferior Bergz, and we would hope to outrank these people really, but for some reason we’re not. Then in position two, we’ve got Moz’s Buy Burgers page on the moz.com/shop subdirectory, which obviously doesn’t exist, but this is a hypothetical. This is a commercial landing page where you can go and purchase a burger. 

Then in position three, we’ve got this Best Burgers page on the Moz blog. It’s more informational. It’s telling you what are the attributes to a good burger, how can you identify a good burger, where should you go to acquire a good burger, all this kind of more neutral editorial information.

So we hypothesize in this situation that maybe if Moz only had one page going for this keyword, maybe it could actually supplant the top spot. If we think that’s the case, then we would probably talk about this as cannibalization.

However, the alternative hypothesis is, well, actually there could be two intents here. It might be that Google wishes to show a commercial page and an informational page on this SERP, and it so happens that the second best commercial page is Moz’s and the best informational page is also Moz’s. We’ve heard Google talk in recent years or representatives of Google talk in recent years about having positions on search results that are sort of reserved for certain kinds of results, that might be reserved for an informational result or something like that. So this doesn’t necessarily mean there’s cannibalization. So we’re going to talk a little bit later on about how we might sort of disambiguate a situation like this.

Classic cannibalization

First, though, let’s talk about the classic case. So the classic, really clear-cut, really obvious case of cannibalization is where you see a graph like this one. 

Hand drawn graph showing ranking consequences of cannibalization.

So this is the kind of graph you would see a lot of rank tracking software. You can see time and the days of the week going along the bottom axis. Then we’ve got rank, and we obviously want to be as high as possible and close to position one.

Then we see the two URLs, which are color-coded green and red here. When one of them ranks, the other just falls away to oblivion, isn’t even in the top 100. There’s only ever one appearing at the same time, and they sort of supplant each other in the SERP. When we see this kind of behavior, we can be pretty confident that what we’re seeing is some kind of cannibalization.

Less-obvious cases

Sometimes it’s less obvious though. So a good example that I found recently is if, or at least in my case, if I Google search Naples, as in the place name, I see Wikipedia ranking first and second. The Wikipedia page ranking first was about Naples, Italy, and the Wikipedia page ranking second was about Naples, Florida.

Now I do not think that Wikipedia is cannibalizing itself in that situation. I think that they just happen to have… Google had decided that this SERP is ambiguous and that this keyword “Naples” requires multiple intents to be served, and Wikipedia happens to be the best page for two of those intents.

So I wouldn’t go to Wikipedia and say, “Oh, you need to combine these two pages into a Naples, Florida and Italy page” or something like that. That’s clearly not necessary. 

Questions to ask 

So if you want to figure out in that kind of more ambiguous case whether there’s cannibalization going on, then there are some questions we might ask ourselves.

1. Do we think we’re underperforming? 

So one of the best questions we might ask, which is a difficult one in SEO, is: Do we think we’re underperforming? So I know every SEO in the world feels like their site deserves to rank higher, well, maybe most. But do we have other examples of very similar keywords where we only have one page, where we’re doing significantly better? Or was it the case that when we introduced the second page, we suddenly collapsed? Because if we see behavior like that, then that might,  you know, it’s not clear-cut, but it might give us some suspicions. 

2. Do competing pages both appear? 

Similarly, if we look at examples of similar keywords that are less ambiguous in intent, so perhaps in the burgers case, if the SERP for “best burgers” and the SERP for “buy burgers,” if those two keywords had completely different results in general, then we might think, oh, okay, we should have two separate pages here, and we just need to make sure that they’re clearly differentiated.

But if actually it’s the same pages appearing on all of those keywords, we might want to consider having one page as well because that seems to be what Google is preferring. It’s not really separating out these intents. So that’s the kind of thing we can look for is, like I say, not clear-cut but a bit of a hint. 

3. Consolidate or differentiate? 

Once we’ve figured out whether we want to have two pages or one, or whether we think the best solution in this case is to have two pages or one, we’re going to want to either consolidate or differentiate.

So if we think there should only be one page, we might want to take our two pages, combine the best of the content, pick the strongest URL in terms of backlinks and history and so on, and redirect the other URL to this combined page that has the best content, that serves the slight variance of what we now know is one intent and so on and so forth.

If we want two pages, then obviously we don’t want them to cannibalize. So we need to make sure that they’re clearly differentiated. Now what often happens here is a commercial page, like this Buy Burgers page, ironically for SEO reasons, there might be a block of text at the bottom with a bunch of editorial or SEO text about burgers, and that can make it quite confusing what intent this page is serving.

Similarly, on this page, we might at some stage have decided that we want to feature some products on there or something. It might have started looking quite commercial. So we need to make sure that if we’re going to have both of these, that they are very clearly speaking to separate intents and not containing the same information and the same keywords for the most part and that kind of thing.

Quick tip

Lastly, it would be better if we didn’t get into the situation in the first place. So a quick tip that I would recommend, just as a last takeaway, is before you produce a piece of content, say for example before I produced this Whiteboard Friday, I did a site:moz.com cannibalization so I can see what content had previously existed on Moz.com that was about cannibalization.

I can see, oh, this piece is very old, so we might — it’s a very old Whiteboard Friday, so we might consider redirecting it. This piece mentions cannibalization, so it’s not really about that. It’s maybe about something else. So as long as it’s not targeting that keyword we should be fine and so on and so forth. Just think about what other pieces exist, because if there is something that’s basically targeting the same keyword, then obviously you might want to consider consolidating or redirecting or maybe just updating the old piece.

That’s all for today. Thank you very much.

Video transcription by Speechpad.com. 
