This week Google announced a new facet of the way it displays search results, publishing an official blog post that outlines the arrival and potential impact of so-called ‘structured snippets’, according to Econsultancy.

Data contained within web pages that relates to a product, person or particular project can be extracted algorithmically and displayed below the link as an accompaniment to the existing meta description. Although inaccuracies and errors will doubtless occur as this feature is rolled out more widely across Google’s international SERPs, it is yet another change that SEOs and webmasters will have to take on board.

In short, Google has harnessed machine learning to work out which tables embedded in web pages are relevant to a particular search query and which have more to do with how a site is laid out on screen. This means it does not have to alter results individually so that they display pertinent information for users, but can automate the process instead.
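Google has not published how its classifier works, so the following is purely an illustrative sketch of the idea described above: a toy heuristic that separates a genuine data table from a table used only for page layout. The function name, the input format (rows as lists of cells) and every rule in it are assumptions for illustration, not Google's actual criteria.

```python
def looks_like_data_table(rows, has_header):
    """Toy guess at whether a table holds real tabular data.

    Illustrative heuristics only (assumptions, not Google's method):
    - a data table usually declares header cells (<th> in HTML);
    - its rows share a consistent column count;
    - it contains more than one row of content.
    """
    if not has_header or len(rows) < 2:
        return False
    # A layout table often has rows of wildly different widths;
    # a data table's rows all line up under the same columns.
    widths = {len(row) for row in rows}
    return len(widths) == 1


# Example: a product spec table versus a one-row layout wrapper.
spec_table = [["Model", "Weight", "Price"],
              ["X100", "1.2 kg", "$199"],
              ["X200", "1.5 kg", "$249"]]
layout_table = [["sidebar", "main content area"]]

print(looks_like_data_table(spec_table, has_header=True))     # True
print(looks_like_data_table(layout_table, has_header=False))  # False
```

A real system would of course weigh far richer signals (markup semantics, visual rendering, query relevance), but the sketch conveys why clean, consistently structured tables are the easiest for an algorithm to pick out.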

In fact, Google is even going to rank the facts it finds by their quality, accuracy and popularity, which means that sites may start jockeying for position, getting tabular data onto pages in order to appear more prominently on SERPs.

Whether or not this will actually improve click-through rates is another matter, because, like many of Google’s search alterations, the arrival of structured snippets could prove a double-edged sword.

While it certainly means that some highly ranked results will be given even more exposure, there is still the chance that users will get all the information they need from Google’s SERPs without actually visiting any of the sites where the data truly resides.

Google has been gradually attempting to add more facts and figures to search results for some time, with things like the Knowledge Graph making a real impact on how data is relayed to users when they start their search. Some fear that this could change habits to the point that people do not even bother to click through to sites that host content that appears on Google’s SERPs, which could obviously have an adverse impact on traffic.

Of course, one reason to remain positive is that Google itself admits that the accuracy of the information provided by this service will be variable, as is inevitable when relying on algorithms to determine the relevance of information in this way. With such an imperfect system in place, people will still need to click through to sites to get the lowdown on products and services.

In addition, businesses that rely on their sites to relay information to customers about real-world activities, such as opening times, may benefit from this data being delivered more directly and quickly to users when they search. As with all changes made to Google’s search, it is a case of waiting to see the extent of the impact.