
Telemarketers Just Got Harder to Stop

New technology allows users to leave voicemail without the phone ever ringing



We have all received them, on our home phone or cellphone — a telemarketer trying to sell us a product or service. Some of us simply ignore the call, others answer and quickly hang up, while some do listen to the telemarketer’s message. Soon, however, we might not have any of those options; telemarketers have a direct way into our voicemail.

Ringless voicemail is a new technology that allows users to leave you a voicemail through a back door, without the phone ever ringing. There is growing concern that this capability can allow telemarketers to flood your voicemail, causing you to miss important messages.

The technology has been successfully used for hospitals, schools and churches, and developer Josh Justice, CEO of Stratics, says he believes it can be a success in other ways. Justice told NBC News: “Ringless voicemail drops are a non-nuisance form of messaging and are an alternative to robocalls. It really does put the power in that consumer’s hand where they can essentially listen to the message or not listen to the message.”

There are consumer protection laws that restrict some telemarketing, but it’s unclear if ringless voicemail falls under the restrictions. The providers of the technology and business groups contend that since the phone doesn’t ring, it’s not a call — and therefore exempt from the current laws, the New York Times reported.

A provider of the service has already filed a petition with the Federal Communications Commission to officially allow it. The commission has been accepting public comments on the issue but hasn’t given a timetable for a decision.

Politicians are divided on the issue, as it could also restrict their use of the service for campaign purposes.

As of now there is no way to block the unwanted voicemails. Phones don’t yet have a spam feature comparable to those on emails, and developers of the backdoor voicemail argue that the “do not call list” does not apply. You can comment on the petition, or contact the FCC to file a complaint.

What Google’s New SEO Algorithm Means for Your Website

Google’s New SEO Algorithm Looks for This When Ranking Sites

The rules of SEO are constantly evolving.

Search engines like Google update their algorithms so frequently that it can be dizzying to know how to get your site ranked the way you want. One thing is clear, however: keywords just aren’t enough to get you the traffic you want.

So what will give you results when it comes to SEO? According to Google’s latest algorithm, Hummingbird, you need to build sitewide trust.

Trust is the core component of Google’s relevancy-oriented search, and without it, you won’t be relevant.

Building real trust with Google isn’t easy. The smarter the system gets, the harder it can be to rank. But don’t worry just yet. There are things you can do to ensure that your site still ranks the way you want it to.

Why Google’s SEO Algorithms Matter

Google’s algorithm rules aren’t arbitrary: they have a purpose. Before you can improve your SEO ranking, you have to understand the ultimate motivation.

Google’s main goal is to deliver the most relevant search results as fast as possible.

Google uses deep neural networks to create a system that can think like the human brain, or attempt to, anyway. This approach is called “deep learning,” and it’s used all across the Internet to improve user experience.

It’s ultimately an effort to help computers process information the same way humans do.

So when you search for “best website design ideas,” you get results based not only on your query, but on your search history, what other people are searching for, and what sites have content that closely resembles what the engine thinks you mean.

The smarter the algorithms get – the more humanlike – the harder it is to “game” the system. Plugging your site with random keywords doesn’t work anymore, because Google can see through your attempt to keyword stuff.

Instead, you have to get Google to trust you. How do you do this?

In a book entitled SEO 2017: Master Search Engine Optimization, R.L. Adams lays the groundwork: you build trust with age, authority and content.

Building Trust with Age

You may think that using age as a ranking factor puts newer sites at a disadvantage, but know that with Google, age is more than a number.

Google relies on its relationship with your site over time to judge whether or not you’re trustworthy enough to list on the first few pages. Time is still a factor – the longer it knows you exist, the more likely you will be to rank – but if it sees that you produce value for visitors over time (you have heavy traffic, your site gets linked to, you produce frequent content, etc.), your relationship will improve.

Sites that have been around longer get an automatic boost to their rankings, which may come as a relief. Newer sites, or those that post less frequently, will still have to build up their reputation over time.

Keep in mind that age doesn’t necessarily mean when you launched your site, though. Age refers to the indexed age, meaning when Google actually discovered you first. So if you had a site for a while but haven’t done anything with it until now, you will still be a baby in Google’s eyes.

Building Trust with Authority

If you don’t have age in your favor, you can also boost your ranking with authority.

In the past, you would build authority through your Google PageRank. The higher your site sat on the 0-10 scale, the more trusted it would be. If you could earn links from more established (higher-ranking) sites, you could boost your own score.

While Google still uses PageRank as a factor in SEO, it no longer gives public access to PageRank scores, making it impossible to know how you actually fare. Instead, many site owners turn to Domain Authority to gauge the trustworthiness of a site.

Domain Authority is a 100-point score developed by Moz that predicts how well a website will rank on search engine results pages (SERPs). While it’s not a direct replacement for PageRank, it does allow you to see where your site sits in the rankings.

What makes Domain Authority helpful is that it gives you a way to measure the strength of your links. You can see exactly which sites are giving you the best boosts and which links are altogether worthless for your ranking power.

A few ways to improve your Domain Authority include:

  • Optimizing your internal links – Making sure links go to relevant content, use natural anchor text that makes sense to users, point to the right keywords, etc.
  • Creating more link-worthy content – Avoiding keyword stuffing and producing content that earns links and links out to useful sources
  • Pursuing higher-quality links – Linking to trustworthy (older, more established) sources and listing your site in directories such as Google My Business, Yelp, TripAdvisor and the Better Business Bureau
  • Running link audits – Eliminating broken or bad links as often as possible (a simple example follows below)

The better links you have (the better your link profile is), the better your Domain Authority will be.
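
A minimal version of the link audit from that last bullet can be scripted. The sketch below (Python; it assumes the third-party `requests` and `beautifulsoup4` packages, and `https://example.com` stands in for your own page) fetches a page, follows each outbound link, and flags the ones that no longer resolve:

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

def audit_links(page_url):
    """Return (link, problem) pairs for outbound links that look broken."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    problems = []
    for anchor in soup.find_all("a", href=True):
        link = urljoin(page_url, anchor["href"])
        if not link.startswith("http"):
            continue  # skip mailto:, tel:, and fragment-only links
        try:
            resp = requests.head(link, timeout=10, allow_redirects=True)
            if resp.status_code >= 400:
                problems.append((link, "HTTP %d" % resp.status_code))
        except requests.RequestException as exc:
            problems.append((link, type(exc).__name__))
    return problems

if __name__ == "__main__":
    for link, problem in audit_links("https://example.com"):
        print(problem, link)
```

Run periodically, a script like this catches link rot before it starts dragging on your link profile.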

Building Trust with Content

The other thing that Google looks for when building trust is fresh, quality content.

When you publish quality content on a regular basis, you give Google more opportunities to index your site for links as well as for targeted keywords (yes, keywords still matter).

Frequently adding content, like blogs or articles, allows you to optimize each piece with pertinent keywords that can attract visitors to your site, and provides additional ways to link to authoritative sources and higher-ranking sites.

The trick is that your content has to deliver genuine value. In the past, Google’s algorithms would look at the number and frequency of keywords being used throughout the content on your site to determine relevancy.

The trouble is that Google’s new algorithms actually punish keyword stuffing. Instead, they look for specific keywords or keyphrases (even “natural language” search phrases and questions) that fall into the content naturally.

In other words, the keywords have to make sense in context – and yes, Google can tell.
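
As a toy illustration of the frequency signal described above, the sketch below computes keyword density the naive way. The sample text and the idea of an obviously unnatural density are illustrative only; Google publishes no such threshold.

```python
import re

def keyword_density(text, keyword):
    """Fraction of words in `text` that are exactly `keyword`."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return sum(w == keyword.lower() for w in words) / len(words)

sample = "Best widgets here. Our widgets beat other widgets. Buy widgets."
print("density: {:.0%}".format(keyword_density(sample, "widgets")))
# Prints "density: 40%" -- far beyond anything that reads naturally.
```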

This means that for content to help you build your credibility, it has to:

  • Be popular enough to attract traffic
  • Include relevant keywords naturally
  • Provide enough value that users share and save it
  • Include meta tags, title tags and descriptions
  • Be published frequently

The good news is that you can publish as much content as you want, as long as it’s high-quality. This can be one of the best strategies for newer sites looking to rank higher in SERPs, since Google will still build a relationship with your content even if you haven’t been around for long enough to have age or link authority.

Final Thoughts

If you want to create a site that ranks under Google’s new SEO algorithm, you have to focus on building a relationship with Google and steer clear of spammy tactics like keyword stuffing or over-linking.

Thanks to Artificial Intelligence, Google thinks and acts more like a human when it processes your site, meaning that, in a way, it’s judging what you have out there.

In order to make sure it trusts your content, you want to produce content that offers value for searchers, build natural links and relationships with other high-ranking sites, and stick around long enough for Google to see you.

Ask us what we can do to help.  It’s easier than you think.

Why you should still include ‘Sent from my iPhone’ in your mobile signature

Those four little words reveal more than you think

The blog of researcher, writer and speaker Rob Ashton

While conducting some research recently, I discovered a question in a web forum that got me thinking. In a nutshell, the question was: should you include ‘Sent from my iPhone [or Android phone etc.]’ at the foot of an email if you’re composing it on a mobile device?

I confess that, until a few weeks ago, I’d assumed such questions were now redundant. Smartphones and tablets are hardly new. Surely by now we’re all over the ‘Look at me with the latest piece of tech wizardry’ thing, aren’t we?

In fact, couldn’t such a line in an email signature even backfire? After all, it’s a simple enough task to customise it or even remove it altogether. Leaving it in would therefore suggest that you were actually a little, well, technologically challenged.

But then the offending line reappeared in my own iPhone signature, after a software update. Mildly irritated, I resolved to customise it as soon as I had a couple of minutes.

Two weeks later, I still hadn’t updated it. By then though, I was beginning to wonder if there might actually be an advantage to leaving it there.

After all, surely letting people know that I was emailing on the hoof would buy me some leeway when it came to the odd typo or malapropism (at least ‘for all intensive purposes’, if not ‘kind retards’).

It’s not just me – or you

Intrigued, I started doing a little digging and soon found I was not alone, which is how I discovered the forum question.

At the time of reading, the question had attracted 35 responses. A little more rooting around revealed a Guardian article on the same subject that was followed by no fewer than 590 comments. Clearly, it wasn’t just me who was unsure – nor the person who posted the original forum query.

The response reflected a range of views similar to how my own had changed over time. Some people were adamant that you should remove that line altogether, if only to show that you were not a Luddite and incapable of using anything other than default settings.

One person even argued that email signatures don’t matter at all; in fact, they were a distraction from the message and best left off. I would certainly argue strongly against that advice. At the very least, a signature should contain a phone number unless you specifically don’t want your correspondent(s) to know it. I’ve often cursed the lack of this information in an email when I needed to contact someone urgently – say, to explain that I was running late or even to place some business. (This has resulted in potential suppliers losing sales on more than one occasion.)

But opinion generally seemed divided between those who thought the line irrelevant and those who thought it important in setting context and therefore how much detail you should expect.

Clearly there was still some confusion, so I went in search of a better answer. I wondered if there had even been any definitive research on the topic.

The science of sizing people up

There had – and the results were pretty intriguing.

The short answer to the question of whether you should write ‘Sent from my iPhone’ is: yes, you should. Or, at least, you should indicate that you’re sending the message from some sort of mobile device.

But the reason why is longer. Not only that, but it’s the key that unlocks a fascinating area of communication science. Knowledge of that science can enable you to improve everything from a response to a customer-support request to a bid for a contract worth many millions.

The research area is called uncertainty reduction theory (URT). It’s far from a new idea: it was first formulated by social scientists back in 1975. Yet, unless you’re an academic yourself, I doubt you will have heard of it. Certainly, I’ve yet to find it in any book on communication aimed at business or the general public. (I’m working on a fix for that.)

The central idea of URT states that our primary aim in any initial interaction with people is to reduce uncertainty about them. In other words, we want to check that they are what (or who) they say they are, that they have our best interests at heart or that they really will help us having said they would.

This is such an established idea among academics that dozens of them have expanded on or qualified it (for example, to apply it beyond just initial interactions). But the core concept remains firm.

If you think that’s a cynical view of human interaction and that we should have more faith in humanity, bear in mind that you probably carry out this checking process all the time. It’s just that the mechanisms are so ingrained that you may do it very quickly and even subconsciously.

Our main way to reduce uncertainty is through communication, so we have more than one in-built way to work out what’s true and what isn’t whenever someone is sending us a message – be that in writing or verbally.

Communication reduces uncertainty

We’re primed to look for clues – or cues – either that all is well and we can continue with the interaction or that we need to be sceptical and proceed with caution.

Often we send out these cues unintentionally. Many of them we can’t even control very easily, and people we communicate with use those cues. Humans are hard-wired to place a high value on them, according to an area of research allied to URT called warranting theory, which calls these most valuable signals ‘high-warrant’ cues.

Those signals that we can easily manipulate (such as our words) are called low-warrant cues. And we use high-warrant cues to decide how much notice we should take of low-warrant ones.

By now, you’re probably beginning to realise that this is a pretty big deal. After all, if we’re all programmed to look out for signals that those around us have little control over, it could explain why communication so often fails.

Taking control of communication

Note though that high-warrant cues are those we can’t control very easily. That doesn’t mean we can never control them. Some are just things that we think don’t matter much and so don’t pay much attention to.

And that means that, if we work out what those high-warrant yet controllable cues are, we’ll be able to tweak them and begin to (perhaps radically) improve the success rate of our communications.

All of which leads us back (at last) to ‘Sent from my iPhone’. Because, although that’s something that most of us now know how to edit or switch off, that’s not always been the case.

In 2012, two researchers, Caleb Carr and Chad Stefaniak, decided to test the effect of including this phrase in an email signature. It was five years after the first iPhones were introduced, and this signature line was still very common in messages. The reason it was still common was that many people didn’t know how to change it – in other words, it was a high-warrant cue.

Riddled with errors

In their study, they particularly wanted to test how that cue in an email affected perception of its sender and its sender’s organisation. To do so, they recruited a group of 111 people and showed them one of four versions of the same basic message. The versions combined either multiple errors or no errors with either a ‘Sent from my iPhone’ signature or just the sender’s name and organisation.

Now, many of the errors were far from subtle. When I read the original paper, I spotted no fewer than 12 mistakes in the uncorrected example used. They included incorrect capitalising in the name of the sender’s employer, numerous missed apostrophes and sentences that ended with no full stop. The researchers clearly didn’t want to risk participants failing to pick up on these cues.

The message purported to be from an HR director. And participants were asked to rate the sender’s credibility as well as their competence and the prestige of the sender’s employer.

The results? Not surprisingly, the errors had a damaging effect in all three of these areas. But, despite the number of mistakes, the presence of ‘Sent from my iPhone’ significantly reduced that damage.

Smart move to get readers on side

The results do at least suggest that, if you indicate you’re sending a message from your smartphone, your reader will generally forgive the odd mistake.

And this stuff matters. Almost nine out of ten smartphone owners (88 per cent) use their phones to send or receive email, according to a survey by the Pew Research Center. This makes email one of the smartphone’s most popular features. Unlike with text messaging, however, the medium used to compose an email is not obvious unless you make it so. And while we forgive typos in a text, we’re less lenient with emails.

But the implications of this and similar studies go way beyond showing that it’s a good idea to indicate that you are emailing from a mobile device. Because they show that the unintentional cues we send out when we write or speak have a huge impact on how our audience perceives what we’re trying to say.

In communication, first and foremost, it’s the little things that count.

This is How Top Bloggers Get 90% of Their Traffic

Get your SEO strategy figured out, then go crazy creating content.


Social networks, for example, can be a great way to drive traffic to your blog, but they are not the dominant force out there. Similarly, advertising on social networks can be effective, but only for as long as you run the ads and shell out cash. It’s yet another machine that, with rare exceptions, does not compound on its success. This is why the vast majority of top bloggers, even those spending a lot of time promoting themselves on social media, will tell you that social media is not where they get the majority of their traffic.

An influencer in this space who knows this all too well is Darren Rowse, who runs ProBlogger, a website with a huge following that teaches bloggers how to create and grow their blogs. I heard him address this question of where bloggers get traffic on his podcast, so I reached out to him for more detail.

“Most bloggers that I talk with admit to focusing most of their promotional efforts on social media,” he said. “However, when you dig into where most established bloggers get the majority of their actual traffic, the answer I often hear is from Google. It seems to me that many bloggers are overlooking one of the biggest and most lucrative sources of traffic: search. The lure of viral traffic from social is strong but if bloggers put a little bit of time each day into their search strategy instead, I believe they would be far more successful.”

Search engine optimization, or SEO, is really the only traffic-driving force with the potential to compound and one day work passively in your favor. The chances of someone stumbling across your social post from a year ago and sharing it with their audience are slim. And yet year-old blog posts constantly find their way to the top of search results and continue to bring in significant traffic for websites that understand the value of quality content.

For example, according to a study by Hubspot, “66 percent of marketers say improving SEO and growing their organic presence is their top inbound marketing priority.” Similarly, a report by Ascend2 stated, “72 percent of marketers say relevant content creation was the most effective SEO tactic.”

Marketing and SEO expert Neil Patel attributes his blog’s 206 percent traffic increase to the art of search engine optimization and the effectiveness of creating valuable content for the web.

So, what are the steps to creating content with an SEO strategy in mind?

1. Start with relevant keywords, and search for the low-hanging fruit.

You need a firm grasp on what people in your industry or niche are searching for in order to create successful content. A few ways you can do this:

  • Use Google’s keyword planner, or a tool like Ubersuggest.
  • Search keywords on Quora, and look for what questions people are asking.
  • Do a few web searches with those related keywords to see who is currently dominating the first two pages of search.

Once you have a sense of what people are searching for surrounding your area of expertise or interest, you can start to cater your content toward the keywords that are not as competitive. For example, ranking on the first page for “social media” is going to be much harder than trying to rank for something more targeted like “real estate social media strategy.”
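
One way to make that triage concrete: export volume and competition figures from whichever keyword tool you use, then score each candidate. The numbers and the scoring heuristic below are hypothetical, purely to show the “low-hanging fruit” idea.

```python
import math

# Hypothetical export from a keyword tool: monthly searches plus a
# 0-100 competition score (higher = harder to rank for).
candidates = [
    {"keyword": "social media", "searches": 200000, "competition": 95},
    {"keyword": "real estate social media strategy", "searches": 1400, "competition": 30},
    {"keyword": "social media for dentists", "searches": 900, "competition": 22},
]

def opportunity(kw):
    # Log-scale the volume so sheer head-term size can't swamp low competition.
    return math.log10(kw["searches"]) * (100 - kw["competition"])

for kw in sorted(candidates, key=opportunity, reverse=True):
    print("%-40s score %.0f" % (kw["keyword"], opportunity(kw)))
```

Under this toy scoring, the two targeted phrases come out far ahead of the head term, which is exactly the point.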


2. Create long-form content for better searchability.

Marketers often talk about how today’s online readers have short attention spans, but I don’t buy this for a minute. Readers don’t hate long-form content; they hate bad content. They hate bad content even more when it’s long. If the content is great, they want even more of it.

A study by Buzzsumo found that long-form content between 3,000 and 10,000 words performed the best online. In fact, according to the study, there is 16 times more content with fewer than 1,000 words than content with 2,000-plus words.

What this means is that trying to stand out with short-form content in a world of saturated short-form content is extremely difficult. However, if you come in wielding long-form, keyword specific, valuable content, you are far more likely to rise to the top of the rankings and accumulate more organic search traffic. Just make sure it’s great content people actually want to read.

3. Establish a network of backlinks from other websites.

If understanding the landscape is step one, and creating valuable content is step two, then step three is expanding your reach and having other blogs and websites point to your website via backlinks. According to Hubspot, “Companies that blog have 97 percent more inbound links.”

Here are a few ways you can get websites and content publishers to link back to your content:

  • Reach out via email to relevant content publishers in your space or market, and let them know about your content piece. Ask them if it’s a good fit for their audience, and if so, to feel free to share it.
  • Create a similar piece of content for another website, and link back to your own content as an added resource or reference.
  • Quote or otherwise include relevant content creators in your space in your content, and when you publish it, tag them in your social media posts with the article. Do you think they’ll share it? Of course they will!

The key is to get what you’ve created in front of the right people, whether that’s through email outreach, social media or even good old fashioned networking.


With SEO there’s bad news, and there’s good news. The bad news is that SEO is a long-term strategy, which means you’ll need to do a lot of work for a long time to get consistently great results. The good news is that because it’s a long-term strategy, most of your competitors won’t focus on it, and then you win.

 

From: https://www.entrepreneur.com/article/290894

The New Chrome and Safari Will Reshape the Web

Apple and Google are cracking down on obnoxious online ads. And they just might change the way the web works in the process.

Last week Google confirmed that Chrome—the most widely used web browser in the world—will block all ads on sites that include particularly egregious ads, including those that autoplay videos, hog too much of the screen, or make you wait to see the content you just clicked on.

Apple meanwhile announced yesterday that Safari will soon stop websites from automatically playing audio or video without your permission. The company’s next browser update will even give users the option to load pages in “Reader” mode by default, which will strip not only ads but many other layout elements. The next version will also step up features to block third parties from tracking what you do online.

But the two companies’ plans don’t just mean a cleaner web experience. They represent a shift in the way web browsers work. Instead of passively downloading and running whatever code and content a website delivers, these browsers will take an active role shaping your web experience. That means publishers will have to rethink not just their ads but their assumptions about what readers do and don’t see when they visit their pages.

For years, browsers have simply served as portals to the web, not tools for shaping the web itself. They take the code they’re given and obediently render a page as instructed. Sure, browsers have long blocked pop-up ads and warned users who tried to visit potentially malicious websites. But beyond letting you change the font size, browsers don’t typically let you do much to change the content of a page.

“Browsers have always been about standards and making sure that all browsers show the same content,” says Firefox vice president of product Nick Nguyen. “It’s been a neutral view of the web.”

The problem is that this complacency has led to a crappier web. Publishers plaster their sites with ads that automatically play video and audio without your permission. Advertisers collect data about the pages you visit. And criminals sometimes use bad ads to deliver malware.

 Many people have taken the matter into their own hands by installing plugins to block ads or trackers. About 26 percent of internet users have ad blockers on their computers, according to a survey conducted by the Interactive Advertising Bureau. Some 10 percent have ad blockers on their phones.

Now browser-makers are starting to build these types of features right into their products. Firefox added tracker-blocking to its private browser mode in 2015, and Opera added an optional ad-blocking feature last year. Meanwhile, newer companies like Brave and Cliqz have launched privacy-centric browsers of their own.

Now, thanks to Apple and Google, this trend is going mainstream. About 54 percent of all web surfers used Chrome last month, according to StatCounter, and about 14 percent used Safari. In other words, nearly all browsers will at the very least let users curb the worst ads on the sites they visit. And websites will have to adjust.

The Business of Blocking

It might seem weird for Google, one of the world’s largest advertising companies, to build an ad-blocking tool right into one of its core products. But the search giant may be engaging in a bit of online judo. Google only plans to block ads on pages that feature types of ads identified by an ad-industry trade group as the most annoying. Google may be hoping that stripping out the worst ads will eliminate the impetus to download much stronger third-party ad blockers that also block its own ads and tracking.

Apple, which doesn’t depend on advertising revenue, is taking a more radical approach. In addition to blocking cookies that could be used to track people across sites, the company will also give users the choice to display only the main content of a page, throwing out not just ads but extras like lists of “related stories” and other enticements to stay on a particular site. The page’s prescribed fonts and color scheme get thrown out as well.

Safari has offered the reader view as an option since 2010, but traditionally you’ve had to load a page before you can turn the option on. Letting people turn it on by default means they could visit pages and never see the original versions. That’s a big change that goes well beyond ad-blocking. It means that a page’s code could soon act more as a set of suggestions for how browsers should present its content, not a blueprint to be followed as closely as possible.
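
A crude approximation of that idea in Python: keep the paragraphs and throw away scripts, navigation and other chrome. Real reader modes use far more sophisticated heuristics (text density, readability scoring); this sketch assumes the `requests` and `beautifulsoup4` packages and a placeholder URL.

```python
import requests
from bs4 import BeautifulSoup

def reader_view(url):
    """Return just the readable paragraph text of a page."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    # Drop the elements a reader mode would never show.
    for tag in soup(["script", "style", "nav", "aside", "footer", "iframe"]):
        tag.decompose()
    main = soup.find("article") or soup.body or soup
    return "\n\n".join(p.get_text(" ", strip=True) for p in main.find_all("p"))

print(reader_view("https://example.com"))
```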

That doesn’t just change the way companies have to think about ads. It changes the relationship between reader and publisher—and between publishers and browser makers. For example, Brave—the privacy-centric browsing company founded by Firefox creator Brendan Eich—hopes to essentially invert the advertising business model by having the browser, not the webpage, serve up ads, then share the revenue with publishers. That’s just one new model that this new paradigm makes possible, whether publishers like it or not.


What Is Google’s New Event Search Feature And Why Does It Matter?

Jayson DeMers, Contributor
I de-mystify SEO and online marketing for business owners.

 


Google has never been a company to rest on its laurels. Over the past two decades, the search engine giant has continued surprising, delighting, and serving its users with new features, layouts, and inner workings. Not all of these features have been successful, but it’s never long before a disliked feature is improved, replaced, or modified.

Recently, much of Google’s focus has been on improving experiences for mobile users—and by that, I’m referring both to users relying on mobile devices like smartphones, and users who need fast, on-the-go information. Event search is the latest mobile feature, rolled out by Google earlier this month, and it’s worth considering both as a new SEO strategy and as a signal for what’s coming next.

How Event Search Will Work

In the Google search app, event searches take over when Google detects that a user is looking for an event. For example, the basic “events near me” triggers the event search, but specific queries, like “jazz concerts,” also bring up relevant results. Rather than seeing a conventional search engine results page (SERP) layout, users then see a list of events relevant to their query, with the option to filter or reorder results based on a specific date, or by qualifiers like “today,” “tomorrow,” and “next week.”

After clicking on an event in the list, Google will display information for how to attend, such as linking a user to a ticket purchasing app or showing an RSVP option.

Supported Sites (and How to Get Involved)

So where is Google getting the information for these events, and how can you be a part of it?

Before launch, Google worked with a number of event sites to coordinate correct markup and listings for each respective enterprise. At launch, event search was displaying results from Meetup, Yext, Vividseats, Eventbrite, Ticketmaster, SeatGeek, Jambase, LiveNation, Bookmyshow.com, StubHub, Bandsintown, Eventful, and a handful of others. It plans to add support for even more ticket and event apps in the next several weeks and months.

However, you don’t have to wait for Google to reach out to you to make sure your organization’s events are listed in event search results. In fact, all you have to do to see your event listed is mark it up using standard Schema markup protocols—with a few new rules. Google has a handy guide for developers looking to mark up their site’s events, and it’s simple to follow. You’ll need to properly categorize your event, include all the specific information Google requests, create a unique URL for your event, and be careful not to mislabel an event (especially if it takes place over multiple days).

Marking up your events feeds that information to Google, so it can consider those events for relevant searches made by its users. Depending on how your site is currently set up and what types of events you host, it shouldn’t take long to make the change.
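
As a sketch of what that markup can look like, here is a minimal schema.org Event object built as a Python dict and serialized as JSON-LD for embedding in a page. The event details and URLs are placeholders; Google’s developer guide lists the fields it actually requires.

```python
import json

event = {
    "@context": "https://schema.org",
    "@type": "Event",
    "name": "Downtown Jazz Night",            # placeholder event
    "startDate": "2017-07-21T19:30",
    "location": {
        "@type": "Place",
        "name": "Riverside Hall",
        "address": "123 Main St, Springfield",
    },
    "url": "https://example.com/jazz-night",  # unique URL per event
    "offers": {
        "@type": "Offer",
        "url": "https://example.com/jazz-night/tickets",
        "price": "25.00",
        "priceCurrency": "USD",
    },
}

print('<script type="application/ld+json">')
print(json.dumps(event, indent=2))
print("</script>")
```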

The Increasing Shift to Here and Now

One of the most important takeaways from this change is that it marks another step in the search world’s shift toward favoring the here and now. Mobile devices sparked a revolution in moment- and place-focused optimization, and Google keeps pushing for better features. For example, Google overhauled the design and functionality of its local search results to favor users with mobile devices searching for immediate needs while they’re on the go. It has also introduced accelerated mobile pages (AMPs), which are designed to load as quickly as possible for mobile users who need fast information.

The rise in popularity of live video and in-the-moment social media updates also demonstrates users’ interest in seeing more content that’s relevant to their immediate interests. By coordinating content based on proximity to users’ location and proximity to present time, Google is moving forward in new dimensions of “relevance,” and users are demanding more instantly gratifying results.

Is It Worth Optimizing For?

Ignoring the paradigm shift toward immediate gratification for a moment, let’s consider whether it’s “worth it” for businesses to optimize for Google event search. If your business or organization coordinates most of its events through EventBrite, Ticketmaster, or similar sites, your events are likely already optimized for search. You won’t have to change anything to get your events listed. However, if you list your events mostly through your own site, and you host events regularly, it’s imperative that you adopt the latest markup standards so that people can easily find your events when searching for them.

However, even if your organization doesn’t host many events, it’s still in your best interest to mark up your event data whenever it comes up. Employing Schema microformatting is a best practice for all sites, as it makes it possible for your content to be featured in a rich snippet, and properly “understood” by Google’s search crawlers.

Looking to the Future

As you consider how to update your SEO strategy from here, make sure you consider the rise in importance of “here and now” content. Showcasing local events, getting involved in the community, and catering to users’ immediate needs with content and resources is, in my mind, one of the best ways to future-proof your SEO strategy.

It’s likely that Google will continue releasing new features that cater to demanding mobile users over the next several years, so make sure your business stays ahead of the curve.

The Sorry Legacy of Internet Explorer

AUTHOR: KLINT FINLEY.

Internet Explorer will soon be a thing of the past. Starting today, Microsoft will stop supporting Internet Explorer versions 7, 8, 9 and 10 on most operating systems, its biggest step yet toward phasing out one of the most contentious pieces of software ever written.

Microsoft has been distancing itself from the Internet Explorer brand since March, when it launched the Microsoft Edge browser, but IE isn’t quite dead. Edge runs only on Windows 10, so Redmond will continue backing a few versions of Internet Explorer on older operating systems it still supports. But it’s still a big departure. Historically, Microsoft has kept several versions of Internet Explorer current on each supported version of Windows. Starting today, it will support only the latest version of IE that an operating system can run. It will not create new security patches for the older versions, leaving anyone who doesn’t upgrade vulnerable to new hacks or attacks.

That could be a huge hassle for organizations that use custom-built applications that run correctly only on older browsers. But it could be a boon to web developers and designers still trying to find ways to make websites look good on older browsers. Newer web browsers still have their quirks, and sites might look different from one browser to the next. But these differences are small compared to how Internet Explorer mangled web pages in the late 1990s and early 2000s.

By insisting on following its own path with IE rather than following generally accepted standards, Microsoft set web design back by years. That probably drove many aspiring web developers toward careers that didn’t require trying to figure out why the margins between images looked different from one browser to another. Keeping too many old browsers in circulation contributed to that mess. Thankfully, the time has come to move on.

The Bad Old Days
Because Internet Explorer didn’t stick to the guidelines established by the World Wide Web Consortium, the organization that sets standards for web technologies, it often displayed web pages in ways that made them look entirely different from other browsers, such as Netscape, Opera or, later, Firefox. Desperate designers cobbled together ways of making sites work across multiple browsers, but a complex layout sometimes required numerous workarounds. And Internet Explorer 6 was notorious for security vulnerabilities that Microsoft was sometimes slow to patch.

But if it was so bad, why was it so widely used? Most people blame Microsoft’s practice of pre-installing Internet Explorer with Windows starting in 1997, which contributed to a lengthy antitrust suit. Since many users didn’t know other browsers existed, and PC vendors had bulk licensing agreements that prevented them from selling computers with alternatives pre-installed, Microsoft effectively muscled out the competition.

But that’s not the whole story. Microsoft still bundles Internet Explorer with Windows, yet by most measures it has fallen behind Google Chrome, now the world’s most widely used browser. That’s in part because designers and developers have spent years encouraging users to download alternative browsers. But in the late 1990s, countless sites proudly displayed “best viewed on Internet Explorer” banners.

“People don’t remember this, particularly web developers, but there was a time when Microsoft made the best web browser in the world,” JavaScript expert and frequent Internet Explorer critic Douglas Crockford told InfoQ in 2010. “IE 6 was by far the best and continued to be the best browser in the world for many years after, but the other browser makers have all gotten ahead of them.”

That’s an exaggeration. Netscape 6 and Opera 5, both of which were excellent, arrived before Internet Explorer 6. But it’s true that Internet Explorer was ahead of the curve for a few years. Netscape users had to wait three years between the release of Netscape Navigator 4 in 1997 and Netscape Navigator 6 in 2000 (the company ended up skipping Navigator 5 in order to completely rewrite the software). Meanwhile, though Internet Explorer wasn’t very standards compliant, it was quick to add new features in the late 1990s. Developers who wanted to take advantage of cutting edge design and interactivity features had little choice but to use Internet Explorer and encourage their users to do so as well.

But by the time Mozilla, an organization started by former Netscape employees, released the first version of Firefox in 2004, it was Internet Explorer’s turn to seem hopelessly outdated.

Long Hard Road Out of Hell
When Internet Explorer 7 finally arrived in 2006, it was better than its predecessor but still not standards compliant, so designers kept jumping through hoops to get pages to render correctly. Not until Internet Explorer 8 landed in 2009 did Microsoft offer a browser that passed the Acid2 standards test, a widely used measure of how well browsers complied with the standards of the day, and the company lagged in adopting other standards, such as the 3D graphics technology WebGL. By the time Microsoft caught up to the rest of the browser market, the damage to Internet Explorer’s reputation had already been done.

But the biggest problem for Microsoft was that Internet Explorer 6 refused to die. Large organizations that spent vast sums building custom applications that worked only on older versions of Internet Explorer refused to upgrade. Many consumers didn’t know any better, or ran pirated copies of Windows and couldn’t download updates. As a result, Microsoft continued supporting Internet Explorer 6 until April 8, 2014, more than a decade after its release.

To keep that from happening again, Microsoft won’t update anything older than Internet Explorer 9 on Windows Vista and Windows Server 2008, Internet Explorer 10 on Windows Server 2012, and Internet Explorer 11 on Windows 7, Windows 8, and those versions of Windows Server that can run it. The move likely will expose outdated browsers to more security risks. But in the long run it will drive adoption of newer, better browsers.

With most of the old versions of Internet Explorer dead and buried, Microsoft hopes it can finally move beyond the sorry legacy of its early versions. Edge is a fresh start, with a new name, a new code base and a new boss. Microsoft can’t undo the damage it did, but it can end the madness.

World reels from massive cyberattack that hit nearly 100 countries

by Jethro Mullen, Samuel Burke and Selena Larson @CNNMoney

Organizations around the world were digging out Saturday from what experts are calling one of the biggest cyberattacks ever.
Hospitals, major companies and government offices were hit by a virus that seeks to seize control of computers until the victims pay a ransom.
Cybersecurity firm Avast said it had identified more than 75,000 ransomware attacks in 99 countries on Friday, making it one of the broadest and most damaging cyberattacks in history.
Avast said the majority of the attacks targeted Russia, Ukraine and Taiwan. But U.K. hospitals, Chinese universities and global firms like Fedex (FDX) also reported they had come under assault.
Security experts said the spread of the ransomware had been stopped late Friday. But it remained unclear how many organizations had already lost control of their data to the malicious software — and researchers warned that copycat attacks could follow.
Europol said Saturday that the attack was of an “unprecedented level and requires international investigation.” And the U.K. government called an emergency meeting over the crisis.
U.S. Treasury Secretary Steven Mnuchin, at a meeting of world leaders in Italy, said the attack was a reminder of the importance of cybersecurity. “It’s a big priority of mine that we protect the financial infrastructure,” he said.
The ransomware, called WannaCry, locks down all the files on an infected computer and asks the computer’s administrator to pay in order to regain control of them. The exploit was leaked last month as part of a trove of NSA spy tools.
The ransomware is spread by taking advantage of a Windows vulnerability that Microsoft (MSFT, Tech30) released a security patch for in March. But computers and networks that hadn’t updated their systems were still at risk.
In the wake of the attack, Microsoft said it had taken the “highly unusual step” of releasing a patch for computers running older operating systems including Windows XP, Windows 8 and Windows Server 2003.
But the patches won’t do any good for machines that have already been hit.
“Affected machines have six hours to pay up and every few hours the ransom goes up,” said Kurt Baumgartner, the principal security researcher at security firm Kaspersky Lab. “Most folks that have paid up appear to have paid the initial $300 in the first few hours.”
Experts told CNNTech that an unidentified cyber security researcher accidentally stopped the spread of WannaCry by registering a domain name contained in the ransomware’s code.
The researcher, who uses the Twitter handle @malwaretechblog, told CNNTech they registered the domain name in order to study the virus, but it turned out the ransomware needed it to remain unregistered to keep spreading.
However, a hacker could change the code to remove the domain name and try the ransomware attack again.
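The reported mechanism is simple enough to sketch. In the outline below (Python; the domain is a placeholder, not the one from the actual malware), propagation continues only while a DNS lookup of the hard-coded domain fails, so registering the domain flips the check and halts the worm:

```python
import socket

KILL_SWITCH_DOMAIN = "example-killswitch-domain.test"  # placeholder

def kill_switch_active():
    """True once the hard-coded domain resolves, i.e. someone registered it."""
    try:
        socket.gethostbyname(KILL_SWITCH_DOMAIN)
        return True
    except socket.gaierror:
        return False  # lookup failed: domain unregistered

if kill_switch_active():
    print("Domain resolves: stand down.")
else:
    print("Domain unregistered: the worm would keep spreading.")
```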
And WannaCry has already caused massive disruption around the globe.
Sixteen National Health Service organizations in the UK were hit, and some of those hospitals canceled outpatient appointments and told people to avoid emergency departments if possible. The NHS said in a statement on Saturday that there was no evidence that patient information had been compromised.
In China, the internet security company Qihoo360 issued a “red alert” saying that a large number of colleges and students in the country had been affected by the ransomware, which is also referred to as WannaCrypt. State media reported that digital payment systems at PetroChina gas stations were offline, forcing customers to pay cash.
“Global internet security has reached a moment of emergency,” Qihoo360 warned.
Major global companies said they also came under attack.
Fedex said Friday it was “experiencing interference with some of our Windows-based systems caused by malware” and was trying to fix the problems as quickly as possible. Two big telecom companies, Telefónica (TEF) of Spain and Megafon of Russia, were also hit.
“This is turning into the biggest cybersecurity incident I’ve ever seen,” U.K.-based security architect Kevin Beaumont said.
Russia’s Interior Ministry released a statement Friday acknowledging a ransomware attack on its computers, adding that less than 1% of computers were affected, and that the virus was now “localized” and being destroyed.
The U.S. Department of Homeland Security, in a statement late Friday, encouraged people to update their operating systems. “We are actively sharing information related to this event and stand ready to lend technical support and assistance as needed to our partners, both in the United States and internationally,” the department said.
According to Matthew Hickey, founder of the security firm Hacker House, the attack is not surprising, and it shows many organizations do not apply updates in a timely fashion.
When CNNTech first reported the Microsoft vulnerabilities leaked in April, Hickey said they were the “most damaging” he’d seen in several years, and warned that businesses would be most at risk.
Consumers who have up-to-date software are protected from this ransomware. Here’s how to turn automatic updates on.
It’s not the first time hackers have used the leaked NSA tools to infect computers. Soon after the leak, hackers infected thousands of vulnerable machines with a backdoor called DOUBLEPULSAR.
— Donna Borak, Samuel Burke, Mariano Castillo, Jessica King, Yuli Yang, Steven Jiang, Clare Sebastian and Livvy Doherty contributed to this report.
CNNMoney (Hong Kong)
First published May 13, 2017: 9:57 AM ET

Next steps toward more connection security

Chromium Blog

News and developments from the open source browser project

Thursday, April 27, 2017

In January, we began our quest to improve how Chrome communicates the connection security of HTTP pages. Chrome now marks HTTP pages as “Not secure” if they have password or credit card fields. Beginning in October 2017, Chrome will show the “Not secure” warning in two additional situations: when users enter data on an HTTP page, and on all HTTP pages visited in Incognito mode.


Treatment of HTTP pages in Chrome 62

Our plan to label HTTP sites as non-secure is taking place in gradual steps, based on increasingly broad criteria. Since the change in Chrome 56, there has been a 23% reduction in the fraction of navigations to HTTP pages with password or credit card forms on desktop, and we’re ready to take the next steps.

Passwords and credit cards are not the only types of data that should be private. Any type of data that users type into websites should not be accessible to others on the network, so starting in version 62 Chrome will show the “Not secure” warning when users type data into HTTP sites.

 


Treatment of HTTP pages with user-entered data in Chrome 62

When users browse Chrome with Incognito mode, they likely have increased expectations of privacy. However, HTTP browsing is not private to others on the network, so in version 62 Chrome will also warn users when visiting an HTTP page in Incognito mode.

Eventually, we plan to show the “Not secure” warning for all HTTP pages, even outside Incognito mode. We will publish updates as we approach future releases, but don’t wait to get started moving to HTTPS! HTTPS is easier and cheaper than ever before, and it enables both the best performance the web offers and powerful new features that are too sensitive for HTTP. Check out our set-up guides to get started.
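
If you’re planning a migration, one quick pre-flight check is to confirm that your site answers over HTTPS with a valid certificate and that plain HTTP permanently redirects to it. A minimal sketch in Python, assuming the `requests` package and a placeholder domain:

```python
import requests

def check_https(domain):
    # Fails loudly (SSLError) if the certificate is invalid.
    https = requests.get("https://" + domain, timeout=10)
    print("HTTPS OK:", https.status_code)

    # A proper migration answers plain HTTP with a permanent redirect.
    http = requests.get("http://" + domain, timeout=10, allow_redirects=False)
    location = http.headers.get("Location", "")
    if http.status_code in (301, 308) and location.startswith("https://"):
        print("HTTP redirects to", location)
    else:
        print("Warning: HTTP does not permanently redirect to HTTPS")

check_https("example.com")
```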

Posted by Emily Schechter, Chrome Security Team

Spearhead Multimedia has very inexpensive solutions to make your site secure. Find out here…