Google Introduces WordPress Plugin With Integrated Analytics, Search Console, More

Matt Southern

Google has introduced a new WordPress plugin which brings insights from Google tools to users’ dashboards.

Site Kit by Google allows users to access information in Search Console, Analytics, AdSense, and PageSpeed Insights from the WordPress admin panel.

“With Site Kit installed, WordPress users can access unified insights and Google product capabilities directly from the WordPress admin panel. Where it is helpful, Site Kit will also provide deep links into Google products for advanced reports and product configuration capabilities.”

Google will release Site Kit to beta testers in early 2019. Those who are interested in the plugin can sign up for the beta version here.

Site Kit doesn’t add any new insights that are not already available in Google’s tools, but it does make them easier to access.

For example, users can navigate to a page on their website and click on the Site Kit button in the admin panel to see stats for that specific page.

The plugin will also notify users when they’ve hit publishing milestones and show combined stats for the most recently published posts.

Google plans to expand Site Kit’s capabilities and integrations in the future based on feedback from beta testers.

12 Completely Outdated SEO Practices You Should Avoid

by Sam Hollingsworth

SEO has gone through extensive evolutionary changes over the years, and continues to do so every day.

While most traditional marketing tactics still hold true in digital marketing today, SEO has drastically changed the landscape.

Most, if not all, of these changes have helped improve the web – and search, in particular.

Yet, some people still cling to the “old ways” and try to use outdated SEO practices to improve their brand’s organic search visibility and performance.

Some of the tactics worked a few years ago, but now just aren’t as effective as they used to be.

Yet many novice marketers and/or small business owners are still using these “zombie” SEO techniques (tactics that should be dead, but aren’t for some godforsaken reason).

Not only are they ineffective, but many of the 12 outdated SEO practices below are potentially dangerous to the well-being of your brand, websites, and other digital properties.

1. Keyword Abuse

There are so many ways webmasters and “marketers” continue to misunderstand keywords’ role in general SEO initiatives, and how they should be used in the day-to-day strategy.

Let’s take a more granular look at specific types of keyword abuse and mismanagement, including irrelevant usage, writing for a specific keyword density, and keyword stuffing.

Irrelevant Keyword Targeting/Confusion

All too often, novice SEO practitioners try to fit their content and messaging within the confines of their keyword research (and not much else).

These “marketers” shape the content and its metadata around keywords the page isn’t properly aligned with, ignoring the actual intent of the users searching for those high-volume terms.

This means brands will likely lose readers’ attention before ever having the chance to communicate a real message to them.

If the keywords being marketed don’t align with the content on the page, the disconnect will hinder the content’s success even if it’s otherwise of good quality.

Don’t mislead users by directing them to content that is misrepresented by high-volume keywords just to gain visibility.

Google knows what this looks like, and it can truly be defined as an obsolete SEO practice (as well as a “black hat” technique, in many instances).

Keyword Density

Writing for a specific “keyword density,” like many keyword-focused marketing tactics, is just missing the mark.

Google no longer depends on keyword density (or the ratio of specific keyword usage to the overall page copy) to determine whether a webpage is an effective source for answering a search query.

Search is so much more advanced than simply crawling for keywords; search engines like Google use a multitude of signals to determine search results.

While keywords remain important to the topics and ideas they represent, they are not the lifeline for ranking for high-value search queries.

The quality of content and how the messaging is delivered are the lifeline for that.
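
While density is useless as a ranking target, a quick measurement can still flag copy that reads unnaturally. A minimal Python sketch (the sample copy is made up for illustration):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return the keyword's share of total words, as a percentage."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)

copy = "Best running shoes. Our running shoes are the running shoes to buy."
print(f"{keyword_density(copy, 'running'):.1f}%")  # 25.0% -- far beyond natural usage
```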

Keyword Stuffing

This is probably the oldest trick in the book.

SEO is about keywords, right?

So, loading up our webpages with keywords — especially the same high-value keyword we are aggressively targeting throughout the website — is going to help us show up higher in search, thus outranking our competition?

Absolutely not.

Search engines have, for a long time, known what keyword stuffing is and what kinds of text combinations are unnatural. They treat these as attempts to manipulate search results and demote the content accordingly.

Yes, there may still be valuable content that uses simple keyword stuffing, either intentionally or unintentionally, that is not demoted because of its actual value to users.

Back in the day, webmasters trying to game the system would go as far as putting every keyword variation of a high-value keyword in the website footer or, even more sketchily, make those keywords the same color as the site’s background, effectively hiding them from humans but not the search engine crawlers.

Webmasters have also tried this with links. (Don’t do anything like this.)
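
For illustration only, here is a rough Python sketch (using the third-party BeautifulSoup library) of how an auditor might flag inline-styled text whose color matches an assumed page background. Real pages usually set colors in external CSS, so this catches only the crudest version of the trick:

```python
import re
from bs4 import BeautifulSoup  # pip install beautifulsoup4

PAGE_BACKGROUND = "#ffffff"  # assumed page background color for this toy check

html = """
<footer style="color:#ffffff">cheap shoes buy shoes best shoes shoe deals</footer>
<p style="color:#222222">Normal, visible copy.</p>
"""

soup = BeautifulSoup(html, "html.parser")
for tag in soup.find_all(style=True):
    # Look for an inline text color that matches the background color.
    match = re.search(r"(?:^|;)\s*color\s*:\s*([^;]+)", tag["style"])
    if match and match.group(1).strip().lower() == PAGE_BACKGROUND:
        print("Possible hidden text:", tag.get_text(strip=True)[:60])
```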

Remember, you’re writing for humans, not search engines.

2. Writing for Robots

It’s important to understand that writing unnaturally is, well, not natural.

And search engines know it.

The belief is: writing for the web means we should repeat a subject by its proper name every time it is mentioned, working in variations and plural and singular versions of the word so that “all bases are covered.”

When the page is crawled, search engines see the keyword repeated, in several different versions, supposedly leading the page to rank well for the keyword variations used (over and over … and over again).

This isn’t going to work anymore.

Search engines are advanced enough to understand repeated keywords, their variations, and the unfavorable experience of generally bad content.

Write for humans, not search engine crawlers or any other robot.

3. Article Marketing & Article Directories

Any attempt to game the system doesn’t usually work out in the world of SEO.

But that doesn’t stop people from trying.

Especially when these tactics offer noticeable improvements to a brand, its website, and/or its associated digital properties.

Sure, article directories worked. And they worked pretty darn well for a long time, too.

Commonly considered one of the earliest forms of digital marketing, article syndication was low-hanging fruit for those in the know. And it made sense, since the idea was similar to channels like TV and print that already used syndicated content regularly.

But Google eventually caught on, unleashing its game-changing Panda update in 2011.

Panda chewed up the search landscape, targeting content farms and directories, as well as other websites offering crap content (whether it was simply bad or false, horribly written, nonsensical, or stolen from someone else).

The idea behind article marketing doesn’t make sense in today’s world, where your high-quality content needs to be original and demonstrate expertise, authority, and trustworthiness.

4. Article Spinning

Typically done with software, article spinning is the black-hat tactic of trying to recreate quality content using different words, phrases, and organization.

Essentially, the end result was a garbled mess of an article that made the same points as the source material.

It’s no surprise this isn’t effective anymore.

While AI is getting better all the time at creating content, anything generated by a machine is still of a lower quality than what a human can produce – something original, helpful, and of substance.

5. Buying Links

This one is still biting webmasters many years later.

Like most SEO tactics, if it seems shady, you probably shouldn’t do it.

Buying links is no different.

Once upon a time, it was routine practice to quickly pay to get a high volume of links pointing at your site.

Now we know that backlink profiles need to be maintained and optimized just like the websites we oversee, and that low-quality domains with far too many backlinks pointing to a website may be dangerous to that website’s health.

Google can easily identify low-quality sites, and it will also identify when those sites are sending an abundance of links out that they shouldn’t be.

Today if you want to legitimately help boost the authority and visibility of your website, you need to earn links, not pay someone to build them manually.

6. Anchor Text

Internal linking is a characteristic of any good site structure and user experience.

This is typically done with anchor text, the clickable text of an HTML link, which tells users what type of content they can expect if they click.

There are various types of anchor text (branded, naked, exact-match, website/brand name, page title and/or headline, etc.), but some have most certainly become more favorable than others, depending on the usage and situation.
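
To make those categories concrete, here is a toy Python classifier. The brand name and target keyword are hypothetical, and a real audit would work from crawl data rather than a hard-coded snippet:

```python
from bs4 import BeautifulSoup  # pip install beautifulsoup4

BRAND = "acme"                   # hypothetical brand name
TARGET_KEYWORD = "blue widgets"  # hypothetical money keyword

def classify_anchor(text: str) -> str:
    """Bucket an anchor's text into the common anchor-text categories."""
    t = text.strip().lower()
    if t.startswith(("http://", "https://", "www.")):
        return "naked URL"
    if BRAND in t:
        return "branded"
    if t == TARGET_KEYWORD:
        return "exact match"
    if TARGET_KEYWORD in t:
        return "partial match"
    return "generic/other"

html = '<a href="/w">blue widgets</a> <a href="/w">Acme guide to widgets</a>'
for a in BeautifulSoup(html, "html.parser").find_all("a"):
    print(f"{a.get_text()!r}: {classify_anchor(a.get_text())}")
```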

In the past, using exact-match and keyword-rich anchor text was a standard SEO best practice.

Since Penguin, Google has been better at identifying over-optimized content.

This goes back to the Golden Rule about producing well-constructed content that is user-friendly and natural.

If you’re optimizing for search engines and not humans, you’re likely going to fail.

7. Obsolete Keyword Research Tactics

Keywords have certainly gone through some drastic changes over the last five to 10 years.

Marketers used to have a plethora of keyword-level data at their fingertips, allowing them to see what worked well for their brand and what didn’t, and to get a better understanding of idea targeting and user intent.

Much of this went by the wayside with keyword “(not provided)”.

In the years following, tools popped up that tried to replicate keyword data. But to fully recreate it correctly is simply impossible.

And yet, even with that keyword data now stripped away, marketers are required to do keyword research of their own to understand the industry, the competition, the geographic region, etc.

To do this, many marketers turn to Google’s free Keyword Planner. While the data in there has been subject to some scrutiny over the years, it’s a free Google-owned product that gives us data we previously couldn’t really come by, so many of us continue to use it (myself included).

But it’s important to remember what the data actually represents for keywords.

“Competition” in the Keyword Planner pertains solely to paid competition and traffic, thus it is practically useless to build an organic search strategy around this data.

Some alternatives to this are the Moz Keyword Explorer tool and SEMrush’s Keyword Magic Tool, both of which are paid tools.

Google Trends is helpful for this type of competitive analysis, too, and it’s free.
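
If you want to pull Trends data programmatically, the unofficial pytrends library (a community project, not a Google product) wraps the site. A minimal sketch, assuming the library’s current interface:

```python
from pytrends.request import TrendReq  # pip install pytrends (unofficial)

pytrends = TrendReq(hl="en-US", tz=0)
# Compare relative interest in two example terms over the last 12 months.
pytrends.build_payload(["keyword research", "seo tools"], timeframe="today 12-m")
interest = pytrends.interest_over_time()  # returns a pandas DataFrame
print(interest.tail())
```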

8. Pages for All Keyword Variations

This was once a useful tactic to rank well for all the variations of high-value keywords targeted by your brand and its messaging.

Fortunately, algorithm updates like Hummingbird, RankBrain, and others have helped Google understand that variations of the same word are, in fact, all related to the same topic.

The best, most-useful content around these entities should be most visible due to the value it offers users on the topic, not just one variation of the word.

Aside from the fact that this will lead to brutal site self-cannibalization, it makes a website considerably harder to use and navigate since content will be so incredibly similar.

The negative user experience alone is reason enough not to do this. But the added fact that Google knows better than to overlook this practice makes it a no-brainer.

This tactic evolved and eventually helped lead to the inception of many content farms that were targeting traffic solely for their keyword value and visibility.

This was attributed to the “old way” of optimizing a website — for keywords and search engines, rather than users and their intent.

9. Targeting Exact-Match Search Queries

The tactic of targeting exact-match search queries in hopes of ranking for those queries solely for the traffic numbers — and not because the search query or its answer actually pertained to the business optimizing for it — became a somewhat popular practice before the full deployment of the Google Knowledge Graph.

Marketers would strive to rank in the top spot for exact-match search queries to trigger a breakout box and an increased click-through rate for their sites.

10. Exact-Match Domains

Having high-value keywords in your URL makes sense. To some extent.

But when it becomes confusing or misleading (i.e., it results in a bad user experience), you have to draw the line.

A main best practice for domains is to keep them consistent with your brand.

Brand names should be short, concise, and somewhat meaningful.

Why wouldn’t you want the same from your domain?

Google used to value exact-match domains because, a long time ago, it made sense to use them as a signal.

Behavioral data has since helped Google make changes like this (and many others) that are common-sense clean-up moves.

Run a good company and offer great products and/or services under the brand name, and Google will do the work of making your brand visible when it’s relevant to the people searching for it.

11. XML Sitemap Frequency

We should never try to manipulate search engine crawlers into crawling our website more often than others by making them believe new content was published or substantial site changes were made.

But because webmasters did exactly that in the past, the sitemap is now used quite differently from what was once intended.

Previously, webmasters could give a priority number to each page of a website listed in the sitemap ranging from 0.0 to 1.0.

Since those hints were rarely used correctly, crawlers no longer honor the priority or change-frequency values.

Instead, search engines just crawl the content they deem necessary.

Make sure you adhere to XML Sitemap best practices. Sitemaps are an incredibly important element for every website.
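
For reference, a sitemap is plain XML, and generating a clean one takes only a few lines. A minimal sketch using Python’s standard library, deliberately omitting the largely ignored priority and change-frequency hints (the URLs are placeholders):

```python
import xml.etree.ElementTree as ET

# <loc> and <lastmod> are the fields crawlers still pay attention to;
# <priority> and <changefreq> are legal but widely ignored, so skip them.
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, lastmod in [("https://example.com/", "2018-11-01"),
                     ("https://example.com/blog/", "2018-11-15")]:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```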

12. Bad Content

Face it. There was a time in our world when crappy content could still rank well.

Oh, how times have changed.

Stolen content, thin content, keyword-stuffed content, non-credible content — there was a time when all of this could get past search engine crawlers and be regurgitated back to users as worthy results.

But no more.

We know what it takes to make quality content that is rewarded by search engines because they tell us what’s right and what’s wrong.

If you want to succeed at SEO today, you must do what’s right.

The Risky Review Schemes That Could Sink Your Business

Considering paying for reviews, getting friends and family to leave reviews, or even a ‘review swap’? Snap out of it! Google Gold Product Expert Jason Brown is here to explain how these schemes could end up tanking your reviews, and offers some legitimate and proven tactics to generate reviews as alternatives.

Every business wants to increase the number of online reviews that they have. Whether the goal is to have more reviews than the competition, to repair your overall rating, or simply to rank in, or higher in, the map pack, every business is looking into ways to get reviews. But you need to be smart about your strategy, or you may find yourself merely renting reviews.

If Google catches you running an illegal review scheme (and it will), it will delete all of the reviews connected to the scheme. The FTC also regulates online reviews. Google follows suit and has made review contests a violation of its Terms of Service. Before you stop reading this and say “I won’t get caught,” you need to know that Google receives multiple reports of review schemes every day. Your business could be next.

As a Google My Business Gold Product Expert (formerly the Top Contributor program), I answer business owners’ questions and advise individuals on how to navigate Google My Business issues. On a daily basis, I watch as business after business gets reported for ill-gotten reviews. I’ve seen reports made by marketing professionals, competitors, disgruntled employees, and upset customers.

There is more potential to get caught than there is to hide forever. If you’re like me, and spy on your competition to see what they’re up to, the chances are that one of your many competitors or their marketing company is spying on or monitoring your business.

Review Schemes to Avoid

Review Contests

Review contests are very popular and extremely illegal. The premise of this scheme is to enter the reviewer into a giveaway once they leave a review. I see this a lot with dentists and orthodontists. One dentist ran their review contest twice and both times they were reported to Google.

It doesn’t matter if you say any reviewer can qualify to enter (rather than just those leaving positive reviews); the fact that you are offering an incentive for the review violates Google’s TOS, and the contest’s reviews will be removed.

The dentist in question more than likely received an email from Google advising them to stop the practice. It reads: “Please note that it is against Google My Business policies to offer or accept money, products, or services to write reviews for a business or to write negative reviews about a competitor.”

[Screenshot: the policy warning email Google sends to businesses]

I would bet that this email was in the process of being sent as the dentist set up the second review contest.

Discounted or Free Services

You cannot offer a reviewer any discount on services or products in exchange for reviews. One business I’m aware of offered all of their customers a 10% savings on their next purchase for leaving a review, so Google went and deleted two years’ worth of reviews.

I’ve also seen a thread where a business thanked everyone with a free drink after leaving a review. Google deleted over 400 reviews. Those 400 individuals still kept their free drink after their reviews were deleted by Google.

Review Swaps

I see review swaps the most in the legal niche. A review swap is basically where “you review me” and “I’ll review you”. I see it a lot when looking at GMB listings for lawyers. One reviewer, who is also a lawyer, left reviews for several lawyers in different states.

Google’s TOS states, “Your content should reflect your genuine experience at the location and should not be posted just to manipulate a place’s ratings.”

Prohibited and Restricted Content

Review swaps:

a) don’t reflect a genuine experience

b) are posted to manipulate the ratings

When Google sees reports of these types of reviews, they delete them.

Asking Your Friends and Family for Reviews

This is the worst advice out there and it needs to be stopped. As I stated in ‘review swaps’ above, your friends and family reviews are posted to manipulate your ratings.

I see this a lot: a GMB listing has 7 reviews, all posted 8 months ago, and no new reviews ever get posted. Potential customers want to see fresh and relevant reviews. Customers want to know how the business is now, not how it was a year ago.

In their most recent Local Consumer Review Survey, BrightLocal found that 77% of consumers think that online reviews older than 3 months aren’t relevant.

Review-gating

Review-gating is not a new policy, but Google has just reiterated their stance on this practice. Review-gating is when a customer fills out a survey and, if they score high enough, they are asked to post a review online, but if the customer scores the business too low, they are asked to provide private feedback only.

When Google receives reports of businesses review-gating, it deletes all of their reviews (not just the ones deemed to violate TOS). Your reputation management tool provider doesn’t get dinged; the business’s GMB listing does. The provider keeps your money while all of your reviews are deleted and gone forever.

Remember that you can’t stop an upset customer from posting negative feedback online. They will find a way to share their experience online. You also need negative feedback so that you can grow and improve your business, and also to make your review profile more believable. (100+ 5-star reviews? Something’s up there.)

Receiving reviews is like going to the doctor for a check-up. The doctor will tell you all the positives and the areas you need to improve upon. If your doctor doesn’t inform you that you need to lower your cholesterol, they are doing you a disservice. You also can’t completely stop an upset customer from sharing their feedback. If they are upset enough, they might report you to Google.

What to Do Instead

All of the above review schemes simply don’t work long-term. While they may produce quick results, they open up your business to a possible fine from the FTC and review deletion from Google.

Google will and does email businesses involved in illegal review schemes. This is not the attention you want from Google. If you give away a television or an iPad to solicit reviews and Google deletes all of your reviews, you’ll realize you just rented reviews for a short time. It would have been cheaper to sign up for BrightLocal’s new Reputation Management tool.

If an iPad costs $329 USD and BrightLocal’s reputation tool costs $8 USD per month, a business could safely request reviews for 41 months. That is almost three and a half years’ worth of legitimate Google My Business reviews that will remain and won’t be deleted by Google.
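
The arithmetic behind that comparison, using the prices quoted above:

```python
ipad_prize = 329        # USD, one-off cost of a (TOS-violating) contest prize
tool_per_month = 8      # USD per month for a review-request tool

months = ipad_prize // tool_per_month
print(months)                      # 41 months of legitimate review requests
print(f"{months / 12:.1f} years")  # ~3.4 years
```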

When it comes to reviews, I tell all new brick and mortar businesses that they should be getting 5 to 10 new reviews per month. This really isn’t that hard if you train your staff to listen to your customers. If a customer says how great the service is, ask them to share that feedback online and leave your business a Google review.

If a business gets 10 customers a day, that’s 50 to 70 people per week. The odds are in your favor to get at least one of those customers to leave you a review online. It’s the law of averages and it will work out in your favor. You and your staff just need to ask.

You can run a contest among your employees to see who can get the most reviews. This can also get your employees to start focusing more on their customer service skills and the level of service they provide. After all, how are you going to get a review if you don’t ask for it?

Don’t Be Afraid of Negative Reviews

Reviews are about the customer experience. They should never be looked at as “I need X amount of reviews to rank higher, have more reviews than my competitor or to repair my reputation”. That’s the incorrect thinking businesses have when it comes to reviews and that thinking is a recipe for disaster.

If you have a “5 stars or bust” mentality, then when your business gets that one negative review (and it will) it will really upset you. I often see business owners get very distraught over one negative review. They plead their case on the Google My Business forum on how:

  • it’s not fair
  • we have nothing but 5-star reviews
  • it’s not a customer
  • we have no record of the person
  • it has to be a competitor

…and then they respond publicly to the review in a rude and unprofessional manner.

A negative review is an opportunity to plead your case and get the customer to contact you to resolve the complaint. Google notifies the reviewer of your reply too.

The goal of your reply is to persuade the user to contact you and work out a resolution. As consumers are reading more reviews, they are also reading the replies to reviews.  If you sound angry in your reply, it will do more harm than good, and that reviewer will not contact you to resolve the issue.

Conclusion

The bottom line is that your business needs customers to stay in business. If you’re not monitoring your reviews and replying in a polite and professional manner, your potential customers will go elsewhere.

You need to take a deep and serious look at your reviews and address any areas customers are not happy with. One business I have been monitoring for two years officially closed in October 2018. They never addressed the underlying causes of their negative reviews. Instead, they focused on a review scheme to combat the negative reviews. It didn’t work, and the restaurant wasn’t saved.

Review schemes will not work for your business either. To quote my favorite line from the movie Shawshank Redemption,

“get busy living, or get busy dying.”

Only you can save your business. Will you?

Jason Brown is SEO Manager at Over The Top Marketing and a Top Contributor on the Google My Business forum. He spends his free time battling fake online business reviews. He can be found on Twitter @keyserholiday.

4 Ideas for Your Holiday Marketing Campaign

Depending on your business, there are a number of different campaigns you could run during the holiday season.

The type of campaign you decide to run will depend on the products and services you offer, and the audience you’re trying to reach. You will also need to consider the type of results you’re looking for, and your overall goals for the upcoming season. In our experience, Constant Contact email marketing and the new Facebook Ads have been the most successful.

To help you get started, we’ve compiled a list of possible campaigns you can try out.

Offer a coupon

The key to a great offer is that it’s compelling enough to get people to act. You can add a coupon to any email and let customers redeem it in-store or online.

Learn more: Create the Perfect Offer: 4 Questions You Need to Ask First

Plan an event

Hosting a holiday event is the perfect way to thank customers for their continued support. It’s also a great opportunity to interact with your audience face-to-face.

Learn more: How to Make Sure Your Holiday Event Doesn’t Fall Flat

Run a contest

Contests are a great way to engage your audience and can help generate buzz during the holiday season. Come up with a prize that your customers will love, and encourage them to enter by providing their email address.

Learn more: Let Spearhead Multimedia Create Your Holiday-Themed Facebook Promotion

Add value

If running a promotion doesn’t fit your business, you can still do something special for your customers by sending them a thank you email or offering something of value.

Learn more: How to Add Value this Holiday Season without offering a Discount

Google just told us how to fix the worst thing about Androids and iPhones

No matter how strong your allegiance to Android or iPhone is, you’ll probably agree that the worst thing about Android phones and iPhones is battery life. Yes, most new phones will get you through the day, and the advantage is clearly on Android’s side, as some vendors have equipped their devices with massive battery packs. But battery life is never enough, especially as the battery degrades over time. Thankfully, Google just told us how to improve battery life on certain Android phones and most of the new iPhones, admitting a mistake in Android design in the process.

It turns out it’s something as easy as switching to dark mode whenever possible. That’s something smartphone-savvy users have long suspected: that dark mode helps conserve battery life. There is a caveat, however. The screen has to be an OLED one. But that’s hardly a problem these days, as most of the flagship devices out there pack OLED screens, premium iPhone X versions included.

Google shared data about energy consumption on phones at this week’s Android Dev Summit, SlashGear reports.

The company studied energy consumption on phones with white and dark themes and concluded that at max brightness, dark mode on OLED always wins. With OLED screens, each pixel lights up independently, which is why dark mode helps preserve battery life.
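
As a back-of-the-envelope illustration (a toy model with made-up numbers, not Google’s measurements), the per-pixel behavior is easy to simulate:

```python
# Toy model: real OLED power draw depends on the panel, color mix, and brightness.
def oled_power_mw(pixel_levels, max_mw_per_pixel=0.001):
    """Sum per-pixel draw; 0.0 = black (pixel off), 1.0 = full white."""
    return sum(level * max_mw_per_pixel for level in pixel_levels)

white_theme = [1.0] * 1_000_000    # mostly white UI on a 1MP screen
dark_theme = [0.05] * 1_000_000    # mostly black UI, a few lit elements

print(oled_power_mw(white_theme))  # 1000.0 mW
print(oled_power_mw(dark_theme))   # ~50 mW
```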

Google also showed a comparison between the original Pixel and the iPhone 7, which is self-explanatory as long as you’re aware of the screen differences between the two devices. The OLED screen on the original Pixel does consume less power in dark mode than the iPhone 7’s LCD.

Pixel phones come with OLED screens, as do Samsung flagship devices like the Galaxy S or Note, and Apple’s iPhone X, iPhone XS, and iPhone XS Max. But you won’t really find true dark modes for any of them.

Even Google admitted that it was wrong to impose white as the predominant color for Material Design apps. Apple’s iPhone UI, meanwhile, is also heavy on white, and there’s no dedicated dark mode on iPhone either. Interestingly, Apple launched a dark mode for Mac, although all Macs have LCD screens, which means it won’t help with battery life. Samsung phones, meanwhile, will get a dark mode via the One UI update, but not all its phones are eligible for it.

Just because Google told us how easy it is to “fix” battery life on OLED smartphones doesn’t mean we’re getting system-wide dark modes from either Google or Apple anytime soon. But independent apps may offer users dark modes, with YouTube being one such example.

Internet Facts to Blow Your Mind

by Guest Blogger, Louise Harris

As quickly as one technology trend arrives, there is another one right behind it, so it is getting increasingly difficult to keep up with all this digital innovation that is readily available at our fingertips.

In the last twenty years, we have gone from the very early stages of mobile phone usage to a world where we can do our grocery shopping with a few clicks on a smartphone. The capabilities of the Internet seem endless and the stats show us just how much impact the Internet has had over the last few years.

This infographic reveals some very interesting digital information that might surprise you. For example, did you know that across the world there are over 4 billion Internet users? A massive 2 billion of that population is located in Asia and there are now 3.2 billion social media users (as of Jan 1st, 2018).

It is hard to imagine a world without the Internet now that it has become so integral to our daily routines. Social media is not just a way for people to connect with friends; it is also a strong business marketing channel with 90% of businesses now actively using social media.

Watching videos on YouTube has become a regular hobby for all generations and particularly the younger generations. There are now more than 1.5 billion YouTube users worldwide and anyone can quickly record a video using their smartphone or create their own tutorial on a webcam.

52.2% of website traffic is now via mobile phones and we have seen changes in website development to reflect this by making websites more mobile friendly. In 2018 over a billion voice search queries per month were recorded and this is a trend that is expected to continue through 2019.

Russia’s Elite Hackers Have a Clever New Trick That’s Very Hard to Fix

The Fancy Bear hacking group has plenty of tools at its disposal, as evidenced by its attacks against the Democratic National Committee, the Pyeongchang Olympics, and plenty more. But cybersecurity firm ESET appears to have caught the elite Russian team using a technique so advanced, it hadn’t ever been seen in the wild until now.

ESET found what’s known as a UEFI rootkit, which is a way to gain persistent access to a computer that’s hard to detect and even harder to clean up, on an unidentified victim’s machine. The technique isn’t unheard of; researchers have explored proofs of concept in the past and leaked files have indicated that both the CIA and the independent exploit-focused company Hacking Team have had the capability. But evidence that it has happened, in the form of malware called LoJax, represents a significant escalation in the Fancy Bear—which ESET calls Sednit—toolkit.

In a Flash

If “LoJax” sounds vaguely familiar, it’s because you might recall LoJack—formerly known as Computrace—security software that lets you track your laptop in the event of theft. LoJack turns out to be potent stuff. It sits in a computer’s firmware, making regular calls back to a server to announce its location. Crucially, that also means you can’t get rid of it by reinstalling your operating system or swapping in a new hard drive.


That’s an intentional security feature: If someone steals your computer, you want to make it as hard as possible for them to evade detection. But it also presents a unique opportunity to bad actors, as outlined in a 2016 presentation at a security conference called Zero Nights, and again in more detail this May by researchers at security firm Arbor Networks. Essentially, Fancy Bear figured out how to manipulate code from a decade-old version of LoJack to get it to call back not to the intended server, but one manned instead by Russian spies. That’s LoJax. And it’s a devil to get rid of.

“Whenever a computer infected with a UEFI malware boots, it will place the LoJax agent on the Windows file system, so that when Windows boots, it’s already infected with the LoJax agent. Even if you clean LoJax from Windows, as soon as you reboot, the UEFI implant will reinfect Windows,” says Alexis Dorais-Joncas, ESET’s security intelligence team lead.

It is possible to remove LoJax from your system entirely, but doing so requires serious technical skills. “You can’t just restart. You can’t just reinstall your hard drive. You can’t replace your hard drive. You actually have to flash your firmware,” says Richard Hummel, manager of threat intelligence for Arbor Networks. “Most people don’t know how to do that. The fact that it gets into that spot where it’s really difficult to use makes it really insidious.”

Most antivirus scanners and other security products also don’t look for UEFI issues, making it even harder to detect whether malicious code is there. And if it is, you’re in trouble.
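
One generic starting point for checking firmware integrity is to compare a dump of the SPI flash against a known-good vendor image. This is only a hypothetical sketch: actually obtaining the dump and the reference image requires platform-specific tooling, and the filenames here are placeholders:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Hash a file in 1 MB chunks so large firmware images don't exhaust RAM."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Placeholder filenames: a dump taken from the machine and a vendor reference.
if sha256_of("firmware_dump.bin") != sha256_of("vendor_reference.bin"):
    print("Firmware differs from the reference image -- investigate further.")
```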

“Decade-old software and hardware vulnerabilities are easily exploited by modern attackers, so companies must use good endpoint hygiene best practices including ensuring endpoints and firmware are up-to-date, leveraging anti-malware, and confirming other endpoint protection agents are always present and healthy,” says Dean Ćoza, executive vice president of products at LoJack developer Absolute. “We take the security of our platform extremely seriously, and are working to confirm these issues do not impact our customers or partners.”

Takeover

The malware ESET observed does not itself actively steal data from an infected device. Think of it not as a robber, but as a door into your house that’s so hidden, you can’t see it even if you pore over every wall. LoJax gives Fancy Bear constant, remote access to a device, and the ability to install additional malware on it at any time.

“In effect, it allows the attacker to take over the machine and download whatever they want,” says Hummel. “They can also use the original intent of the malware, which is to track the location of the infected machines, possibly to specific owners that may be of interest to the attackers.”


Several details about the Fancy Bear UEFI attack remain either vague or unknown. ESET’s Dorais-Joncas confirmed that the device they spotted it on was “infected by several pieces of malware,” and that the hacking group targeted government entities in Europe. They don’t know exactly how Fancy Bear hackers gained access to the victim’s device in the first place, but Dorais-Joncas suggests that they likely followed their typical strategy of a spearphishing attack to gain an initial foothold, followed by movement through a network to locate more high-value targets.

The security firm has more specificity, though, in terms of how exactly Fancy Bear operated once it got that initial control. First, the hackers used a widely available tool to read the UEFI firmware memory, to better understand what specific device they were attacking. Once in possession of that image, they modified it to add the malicious code and then rewrote the infected image back to the firmware memory. The process was not automated, says Dorais-Joncas; a human behind a keyboard went through every step.

Those details offer some hope for future potential victims. Namely, the attackers were only able to write onto the target computer’s firmware in the first place because it was an older device; Intel and others have baked in better protections against that behavior, especially after the Hacking Team and CIA revelations. Using the Windows Secure Boot feature, too, would prevent this type of attack, since it checks to make sure that the firmware image on your computer matches up with the one the manufacturer put there.

“On the other hand,” says Dorais-Joncas, “probably more attacks will take place,” given that Fancy Bear has figured out how to do it successfully. And now that it’s widely known that Fancy Bear did it, copycats may not be far behind.

“Whenever we see these new tactics, it does not take long for other hackers to figure out how they did it and to mimic it,” says Hummel.

Russia’s hackers already have an elaborate hacking toolkit. But the introduction of a UEFI rootkit—stealthy, complex, pernicious—affirms just how advanced their capabilities have become. And more importantly, how hard they are to defend against.

The Best Reason to Use a Professional WordPress Developer

Thousands of WordPress sites backdoored with malicious code

Malicious code redirects users to tech support scams, some of which use the new “evil cursor” Chrome bug.


Thousands of WordPress sites have been hacked and compromised with malicious code this month, according to security researchers at Sucuri and Malwarebytes.

All compromises seem to follow a similar pattern –to load malicious code from a known threat actor– although the entry vector for all these incidents appears to be different.

Researchers believe intruders are gaining access to these sites not by exploiting flaws in the WordPress CMS itself, but vulnerabilities in outdated themes and plugins.

When they gain access to a site, they plant a backdoor for future access and make modifications to the site’s code.

In most cases, they modify PHP or JavaScript files to load malicious code, although some users have reported seeing modifications made to database tables as well.
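
As a rough illustration of what responders look for, here is a hypothetical Python sketch that flags PHP files containing commonly abused obfuscation calls. It is a crude heuristic with false positives, not a substitute for diffing files against pristine plugin and theme copies:

```python
import re
from pathlib import Path

# Functions frequently (ab)used to hide injected payloads in PHP files.
SUSPICIOUS = re.compile(rb"eval\s*\(|base64_decode\s*\(|gzinflate\s*\(")

wp_root = Path("/var/www/html")  # assumed WordPress install path
for php_file in wp_root.rglob("*.php"):
    if SUSPICIOUS.search(php_file.read_bytes()):
        print("Review manually:", php_file)
```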

Malwarebytes security researcher Jérôme Segura said this malicious code filters users visiting the compromised sites and redirects some to tech support scams.

He says some of the traffic patterns seen during the redirection process match the patterns of a well-known traffic distribution system used by several malware distribution campaigns.

Segura also said that some of the tech support scams that users are landing on are using the “evil cursor” Chrome bug to prevent users from closing the malicious site’s tab, a trick the researcher first spotted last week.

This WordPress site hijacking campaign appears to have started this month, according to Sucuri, and has intensified in recent days, according to Segura.

Googling just one of the pieces of malicious JavaScript code added to the hacked WordPress sites reveals only a small portion of the total number of hacked sites. In this case, the string search yielded over 2,500 results, including a corporate site belonging to Expedia Group, the parent company behind the Expedia portal.

Last week, ZDNet revealed that attackers had been scanning the Internet in an attempt to exploit a recent vulnerability in a popular WordPress plugin.

While Sucuri did not find confirmation that this vulnerability was now being used in this recent wave of site hacks, the company did confirm our initial report, based on WordFence’s telemetry.

Contact Spearhead Multimedia today and get your free WordPress Website security evaluation.

We offer special incentives for new clients who want to move to a new, secure host, update and harden their WordPress websites and create new WordPress websites.  Call 954-202-8004 or use the Contact Us form.

Mobile-First Indexing: Your Guide to Google’s Big Shift


As Google makes the big change to mobile-first indexing, it’s important that your site is ready for the shift. Are you fully prepared?

Let’s start at the beginning.

What Is Mobile-First Indexing?

The mobile-first initiative is an effort to address the growing percentage of mobile users in today’s search landscape.

Back in March, on its Webmaster Central Blog, Google announced that it is rolling out mobile-first indexing more broadly, which is a big change to how Google crawls and indexes your site. The push is on now, and mobile-first indexing is being fully implemented.

What’s Changing about Google’s Rankings?

Per Google, “Mobile-first indexing means Google will predominantly use the mobile version of your website’s content for indexing and ranking.”

But what does that mean?

Currently, Google crawls and indexes your site based on the desktop version of your site and the content that exists there.  With this change, Google will be looking at your mobile site first and the content on that version to determine how your site is ranked.

For example:

[Diagram: desktop vs. mobile versions of a site; Google will now index the mobile version.]

Over the course of the last year, Google has been slowly experimenting with a small percentage of sites, making the switch to crawling, indexing, and ultimately ranking them based on their mobile experience rather than the desktop experience it has always used.

This doesn’t mean your desktop site isn’t important anymore; it just means Google will be looking at it as a secondary source, not the primary one, for crawling, indexing, and ranking as it has been in the past.  But even if your site is doing well organically, if it’s not responsive (mobile friendly), your ranking will drop substantially.  Don’t lose those years of building your search engine position; contact us today.

How Mobile-First Indexing May Impact Your Site

Depending on how you handle mobile, this change may or may not directly affect your site.

  • If your site is built in responsive design, you will see no impact, as your site adapts to all devices.
  • If you have a separate m. site (or something similar) and your primary content does not exist on it, then you are at risk of seeing a negative impact as Google will no longer be looking at your desktop version.
  • If you do not have a mobile site/experience then this change will negatively impact you.  Also, it’s 2018: if you don’t have a mobile-friendly site then you have much larger issues than this change. (A quick self-check is sketched below.)
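
One very quick self-check (a sketch, not a substitute for Google’s own Mobile-Friendly Test) is to confirm your pages even serve a viewport meta tag, the baseline requirement for responsive rendering. The URL is a placeholder:

```python
import requests                 # pip install requests
from bs4 import BeautifulSoup   # pip install beautifulsoup4

resp = requests.get("https://example.com", timeout=10)  # your URL here
soup = BeautifulSoup(resp.text, "html.parser")
viewport = soup.find("meta", attrs={"name": "viewport"})
print("viewport meta:", viewport["content"] if viewport else "MISSING")
```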

What Mobile-First Best Practices Can I Follow To Ensure I Maximize My Opportunity?

Google has published an entire list of best practices for mobile-first indexing on their developers’ blog.

While there are many things to consider and you should read through the entire list above, two major points are ensuring you have mobile-friendly content and that your site loads as fast as possible.  Site speed is becoming an increasingly important ranking factor, which coincides with users’ needs to get everything as quickly and seamlessly as possible.  With the rapid adoption of AMP (accelerated mobile pages) and the growing popularity of Progressive Web Apps (PWAs), it’s not surprising to see Google pushing site owners in this direction.

How Do I Know If Google is Using Mobile-First Indexing for My Site?

Google will be notifying site owners that their sites are migrating to mobile-first indexing through Search Console.  The message will look like this:

[Screenshot: example of Google’s Search Console notification of mobile-first indexing]

So you need to make sure that if you have an m. version of your site, it is verified in Search Console.

You will also see a significant increase in the Smartphone Googlebot crawl rate, and Google will show the mobile version of pages in search results and cached pages.
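
If you have server log access, a few lines of Python can confirm that shift. A rough sketch that assumes a standard access log and keys off the Android identifier in Googlebot’s smartphone user agent (user-agent strings change over time, so treat this as illustrative):

```python
from collections import Counter

counts = Counter()
with open("access.log") as log:  # assumed combined-format access log
    for line in log:
        if "Googlebot" in line:
            # Google's smartphone crawler announces an Android device in its UA.
            counts["smartphone" if "Android" in line else "desktop/other"] += 1
print(counts)
```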

What Do We Think About This?

This is a major change in how Google interacts with our websites and makes sense as more and more traffic continues to move to mobile.  While your desktop site will certainly remain important and Google will not be ignoring it, users have been trending towards mobile usage for years and this is the natural progression of our industry.

Companies need to take notice of this change.  Thinking mobile-first should not be something that is kicked down the road and moved down on priority lists; from a search perspective, this should be top of mind for all organizations, large and small.

Should you be concerned?  If you haven’t been paying attention to how your site functions on a mobile device, this change probably isn’t going to pan out well for you.  The good news is that all websites are living documents and can be changed and updated.  If you are coming in a little late to the game on mobile, then now is the time to improve that experience and ensure your site is set up to provide value to mobile users.

This is yet another banner that Google is waving to signal the importance of your mobile experience.  If you have been neglecting it, now is the time to rectify that and put people and resources behind it.

If you think your site is not mobile-friendly, or you have tested it and know it isn’t, contact us for advice on bringing your website up to speed with current technologies.


What is the dark web? The good and bad of the Internet’s most private corner


You may have heard the dark web is a place for drug dealers and hitmen. That’s correct, but there’s more to it than that. In this article, find out what the dark web is, how to access it, and what you might find there.

The dark web is a part of the Internet that requires special software to access and is not indexed by search engines. It offers much greater privacy than the widely accessible parts of the World Wide Web.

That privacy also makes the dark web a setting for illegal activity, scams, and offensive content. The high-profile rise and fall of the Silk Road marketplace for illicit drugs is the best-known example of this. But despite the sensational media coverage, few people really understand what the dark web is or how it works. For instance, it might surprise some people to learn that The New York Times and Facebook both maintain websites on the dark web.

The dark web isn’t “dark” because it’s bad; it’s dark because it’s the only place on the Internet that offers a bit of privacy. In this article, we’ll explain how that works, what actually happens on the dark web, and how you can check it out for yourself.

What is the dark web?

Think of the Internet as divided into three parts: the clearweb, the deep web, and the dark web.

The clearweb is the Internet most of us are familiar with. Its pages are searchable in Google, but it makes up just a small percentage of all the content on the Internet. The deep web comprises the majority of the Internet, but it is not indexed by search engines, it is often password-protected, and therefore it’s not generally accessible. The deep web includes things like financial databases, web archives, and password-protected pages.

The dark web is a small portion of the deep web. It runs on top of existing Internet infrastructure, but it is a parallel web that cannot be accessed without special tools. For this reason the dark web is sometimes referred to as the hidden web.

Websites on the dark web have domains ending in “.onion” and are sometimes known as onion sites. They’re called onion sites because of the kind of encryption technology they use to hide the IP address of the servers that host them. Websites on the dark web mask their data behind multiple layers of encryption (like the layers of an onion), and can only be accessed through the Tor network, which is a network of computers around the world maintained by volunteers. Because the routing is random and the data is encrypted, it’s extremely difficult for anyone to trace any piece of traffic back to its source.
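
As a toy illustration of that layering idea (not Tor’s actual circuit cryptography), here is a Python sketch using the cryptography package’s Fernet primitive: the sender wraps the message once per relay, and each relay can peel exactly one layer:

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Three relays, three keys: no single relay sees both sender and plaintext.
relays = [Fernet(Fernet.generate_key()) for _ in range(3)]

message = b"hello from the onion"
for relay in reversed(relays):   # sender wraps: exit relay's layer goes innermost
    message = relay.encrypt(message)

for relay in relays:             # each relay peels one layer, in path order
    message = relay.decrypt(message)
print(message)                   # b'hello from the onion'
```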

How to access the dark web

Tor is the most popular dark web interface, with millions of users. There are a number of ways to access the Tor network, including via the Tor browser, the operating system Tails, or by installing Tor on your computer. ProtonVPN also provides one-click Tor access through the Tor over VPN feature. From there, you can browse the web normally as well as gain access to highly private and secure onion sites.

Unlike the regular web, however, even after you have connected to the dark web, it isn’t so easy to find websites. Dark web sites use randomly generated domains that aren’t easy to remember. The dark web is also difficult to index, meaning search engines are ineffective. There are a number of link directories, such as The Hidden Wiki, that attempt to catalogue the dark web. But because dark web sites change their domain frequently, you’ll find a lot of dead links. A typical onion site URL looks something like this:

http://3g2upl4pq6kufc4m.onion/

Some special onion sites, though, have easy-to-remember domain names as well as SSL encryption (URLs that start with “https” instead of “http”). For example, ProtonMail’s Tor encrypted email site is at https://protonirockerxow.onion while Facebook’s onion site is at https://facebookcorewwwi.onion. You can learn more about these special onion sites here.
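
For example, a script can reach an onion site through a locally running Tor client using the requests library’s SOCKS support. This sketch assumes Tor is listening on its default port 9050; the “socks5h” scheme makes DNS resolution happen inside Tor, which .onion addresses require:

```python
import requests  # pip install requests[socks]

proxies = {
    "http": "socks5h://127.0.0.1:9050",
    "https": "socks5h://127.0.0.1:9050",
}
resp = requests.get("https://protonirockerxow.onion", proxies=proxies, timeout=60)
print(resp.status_code)
```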

What’s on the dark web?

The illicit uses of the dark web are well documented: assassination services, ecommerce sites for buying guns and drugs, and so on. It’s best to steer clear of anything that seems suspect while browsing there. However, there are plenty of 100% legal things you can do on the dark web. You can read ProPublica or The New York Times, check your email in ProtonMail, or browse your Facebook wall. All of these mainstream websites offer dark web access because of the benefits to privacy and freedom of information.

One of the biggest advantages of the dark web is the difficulty of blocking it. Common forms of censorship, which block traffic to websites at specific choke points along the Internet hierarchy, do not work with encrypted overlay networks. (As a result, some dictators have, for example, tried to block Tor itself.)

For similar reasons, the dark web is more resistant to surveillance by governments and corporations (such as Internet service providers). Whistleblowers, journalists, and other professionals at risk of targeted surveillance use the dark web to communicate sensitive information. And organizations including Human Rights Watch and the Electronic Frontier Foundation support the use of and access to the dark web.

One of the only drawbacks of the dark web is its speed. For instance, because Tor bounces your traffic through multiple servers around the world, it necessarily slows your connection. But when you need it, the dark web can be vitally important: When Turkey temporarily blocked ProtonMail for some users, our onion site was one of the only ways people could gain access to email.

So, there’s no reason to be afraid of the dark web. On the contrary, the dark web is an essential privacy tool. As governments work to weaken encryption with backdoors and corporations gain greater access to everything we do, privacy and security technologies like the dark web must be vigorously defended. And that starts with understanding them beyond sensational headlines.

Best Regards,
The ProtonMail Team

You can get a free secure email account from ProtonMail here.

We also provide a free VPN service to protect your privacy.

ProtonMail and ProtonVPN are funded by community contributions. If you would like to support our development efforts, you can upgrade to a paid plan or donate. Thank you for your support!
