In today’s digital age, the biggest differentiator for success is often the customer experience. Many customers are no longer loyal to a company based on brand image, product offerings or even price. In the end, how the consumer reflects on their interactions could be the biggest factor that influences whether they come back or not.
Companies are expected to offer a personalised service, via multiple channels, whenever and wherever customers want it. On top of this, there is constant pressure to deliver and one bad experience is enough to drive a customer away and into the arms of the competition.
With the stakes this high, it's no surprise that customer experience is now a major talking point in boardrooms across the world. Senior decision makers are considering the processes they can put in place to improve the quality of their customer service, and how they can use that improved experience to turn customers into advocates. Here are three ways artificial intelligence is changing the way businesses and customers interact:

1. Dealing with the data deluge
One of the biggest challenges companies are facing is adapting to the digital nature of today's customer journey. For all the benefits the digital revolution has brought, businesses are inundated with channels on which customers want to communicate with them, as well as the ensuing data. As of 2016, the US alone was creating 2,657,700 gigabytes of Internet data every minute. Although these numbers can be exciting for data enthusiasts, companies often find themselves sitting on a wealth of information they just aren't capable of processing and therefore cannot use to improve customer service. This is where artificial intelligence (AI) comes into play.
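One simple example of turning that raw interaction data into insight is finding the step at which customers most often abandon a journey. The sketch below is illustrative only; the step names and data shape are invented for this example, not taken from any particular analytics product.

```python
from collections import Counter

def drop_off_by_step(journeys):
    """Count the step at which each customer abandoned their journey.

    Each journey is the ordered list of steps a customer completed;
    the last entry is where they stopped ('completed' if they finished).
    """
    last_steps = Counter(journey[-1] for journey in journeys if journey)
    abandoned = {step: n for step, n in last_steps.items() if step != "completed"}
    # The step with the highest abandonment count is the main pain point.
    worst = max(abandoned, key=abandoned.get) if abandoned else None
    return abandoned, worst

journeys = [
    ["browse", "cart", "payment"],               # abandoned at payment
    ["browse", "cart", "payment"],               # abandoned at payment
    ["browse", "cart", "payment", "completed"],  # finished
    ["browse", "cart"],                          # abandoned at cart
]
counts, worst = drop_off_by_step(journeys)
print(worst, counts)  # payment {'payment': 2, 'cart': 1}
```

On real data the same counting logic runs over millions of journeys, which is exactly the scale at which automated analysis beats manual review.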
AI has the ability to process and analyse the vast amounts of data that companies gather, at a far quicker pace than humans ever could. This allows businesses to recognise current pain points and react to them in real time, anticipate future problems, and learn more about their customers' preferences, helping them deliver not only a superior customer experience but also a higher quality service overall. For example, if a company can identify that customers are continually abandoning the journey at a particular point, or that a high percentage of customers are asking the same questions over and over, it can quickly fix issues and close gaps in information. This ensures the experience continually evolves to meet the changing needs of customers, helping businesses serve them better and address issues more quickly and effectively than ever before.

2. Empowering customers and agents
As the adoption of AI in customer experience continues to increase, customer interactions will shift from being heavily dependent on the human agent, to a hybrid model consisting of the human element, with technology helping share the workload. The rise in AI-powered chatbots means customers are empowered to self-serve where appropriate. They will no longer need to wait in long queues for an employee to help them out. But that by no means should translate to massive job loss for human agents as some people are predicting.
Instead, AI should be leveraged to augment the human agent, offloading simple, repetitive tasks and leaving agents free for more creative, meaningful work. AI-powered chatbots are smart enough to know when they aren't smart enough to solve a problem and need their human "co-worker's" help. Technology has made it possible to seamlessly bring the agent into the conversation to continue the interaction right where the chatbot left off. And it's a symbiotic relationship: AI also helps agents on the backend provide a better, more personalised experience, by quickly feeding customer history, preferences and recommendations to the agent during the conversation.
In the end, AI is only as good as the people working with it. Simply put, AI affords companies the ability to better leverage their most valuable resource – their employees – not replace them.

3. Creating a more consistent experience
With the rise in channels that customers use to interact with their favourite brands – be it calling a customer service centre, sending an email, using a mobile app, live chatting on the website, or connecting via social media – businesses need to deliver the same level of service across each and every channel. A Facebook Messenger interaction, for example, needs to provide a message and experience consistent with that offered by an agent in a call centre.
Unfortunately, many organisations are still operating on legacy solutions with bolted on channels that are made up of disparate systems – built on widely different knowledge bases. This results in a very disconnected experience, significant time delays, and the inability to seamlessly hand off an interaction from one to another – leaving customers unhappy, frustrated and more likely to take their business elsewhere.
AI-powered customer engagement platforms are helping to close the information gap between customers and businesses, by connecting these channels. This is allowing bots and agents alike to get a full view of the customer history so all interactions can be informed, actionable and personalised. By having this 360-degree view of the customer journey, companies are in a better position to support customers when, where and how they need it the most.
For example, an agent may pick up the phone to a customer and, before the conversation even begins, know that the frustrated customer sent a direct message on Twitter outlining their problem and hasn't yet received a response. Instead of having that customer rehash the issue, the agent can get right down to the business of solving the problem.
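At its core, that 360-degree view is a chronological merge of per-channel interaction logs. The channel names, timestamps and events below are hypothetical, but the merge itself is the essential operation:

```python
from datetime import datetime

def unified_timeline(*channel_histories):
    """Merge per-channel interaction logs into one chronological view.

    Each history is a list of (ISO timestamp, channel, note) tuples.
    """
    merged = [event for history in channel_histories for event in history]
    return sorted(merged, key=lambda e: datetime.fromisoformat(e[0]))

# Illustrative logs from three separate channels for one customer.
email   = [("2018-06-01T09:00", "email",   "asked about refund")]
twitter = [("2018-06-01T08:30", "twitter", "DM: order arrived damaged")]
phone   = [("2018-06-01T09:15", "phone",   "call opened")]

timeline = unified_timeline(email, twitter, phone)
print([e[1] for e in timeline])  # ['twitter', 'email', 'phone']
```

With the logs interleaved by time, the agent answering the 09:15 call immediately sees the unanswered 08:30 Twitter DM.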
Brands know that they need to be thinking about customer experience, but companies who aren't acting on it will start to feel the full effects as unhappy customers move to the competition. Businesses have the opportunity to put customers above all else and distinguish themselves through better, more personalised customer engagements. AI is making that possible.
Ryan Lester, Director Customer Engagement Technologies at LogMeIn
Image Credit: Jirsak / Shutterstock
Blockchain technologies and cryptocurrencies have proven to be successful mechanisms for managing payment transactions over the last several years. Currencies such as Bitcoin, Ethereum and many others have enjoyed exponential growth in popular interest and adoption, while blockchain technology has been utilised in many applications. Indeed, many have likened cryptocurrencies to the early internet, citing their enormous potential to disrupt payment systems in the same way the internet disrupted information accessibility.
However, online payment solutions in general, and cryptocurrencies in particular, lack the ability to scale in volume and speed. Several blockchain-based technologies have been created to tackle the challenges posed by providing high transaction throughput while remaining inexpensive, but these have been met with little success. Another challenge faced is the lack of trust between unknown parties, which leads to countless chargebacks and transaction cancellations. Moreover, merchants are often classified as ‘high-risk’ or ‘low-risk’ based on their association with a particular industry, rather than their demonstrated behaviour.
In the online payments sphere, a handful of companies have attempted to create new protocols to overcome the challenges of scalability and ease of use in blockchain technologies. Nevertheless, they have proven unsuitable for payment use cases, at least at the level of scale and trust demanded of the industry. As it stands, linear block confirmation protocols have not been able to provide feasible solutions for the recognised shortcomings of blockchains.

What is a DAG blockchain?
Taking off from blockchain-based ledgers, a number of companies have harnessed the potential of directed acyclic graphs (DAGs) to facilitate a unique way of solving the scalability challenge. While a handful of tech companies have adopted varying approaches to the DAG, each node is generally depicted as a transaction in time and must attach to prior transactions in order to be confirmed. DAGs squarely address the biggest challenge of blockchains by providing an efficient workaround to scalability limitations.
These blockchain limitations can be solved by using a DAG as the base of the network, allowing it to scale almost without limit and at negligible cost, with transactions approved in the blink of an eye. There are currently only a handful of currencies that use this next-generation DAG architecture.

How can DAG models improve blockchain technologies, which are already so preeminent all around the world?
The blockchain industry has provided several mechanisms to achieve consensus. Proof-of-work (PoW) was one such method, introduced by Bitcoin's famous Satoshi Nakamoto. In the payments realm, however, there is a counterproductive incentive structure between the level of miner compensation and the number of users transacting in a system with inherently low scalability. Add to this the enormous electricity waste of PoW systems and the growing monopolies of mining farms. Proof-of-stake (PoS) mechanisms emerged to address these downfalls, but confirmation speeds have remained low: Bitcoin can confirm roughly 3-5 transactions per second (TPS), while Ethereum's maximum is around 20 TPS. Neither PoW nor PoS models have been able to meet the needs of consumers who have grown accustomed to the confirmation capacity of Visa and Mastercard, which can reach 65,000 TPS.
Creating trust between two disparate parties is also a well-known challenge. For the sub-industry of anonymous payments, the issue is irrelevant, but for the layer of merchant-consumer, consumer-consumer, and acquirer-to-issuer transactions — the crypto world should be poised to provide a solution.
Distributed ledgers based on a DAG data structure use a mathematical model on a base protocol, consisting of transactions that propagate unidirectionally. The flow of new transactions can be modelled using a Poisson process. Trust Scores are the key mechanism by which new, unconfirmed transactions select prior transactions to validate in order to reach transaction confirmation consensus faster. Each new transaction must validate two previous transactions with a similar Trust Score threshold, or points that have no inbound transactions in the DAG. It then validates those sources and becomes a source itself. To understand how this is done, it is important to note that a user's private key is associated with his or her public key. While any person can sign the bundle hash, only the owner of the private key can sign it in such a way that verification with the public key reproduces the original bundle hash of the transaction. This is how a transaction's signature is checked against the sender's public key.
In a DAG system, an algorithm enables a new transaction to be verified by two previous transactions before it is added to a node. All nodes contain a series of transactions and are acyclic in nature, meaning any given transaction cannot be encountered a second time on another node. This eliminates the problem of double spending while foregoing the need for miners and stakes. What’s more, all transactions are immutable as all nodes flow in a specific direction and cannot be traversed on an opposite trajectory, meaning A → B is not B ← A. Because miners are eliminated in a DAG, processing times will be cut drastically, as well as fees. The ledger is therefore organised as a DAG, where the vertices represent transactions and directed edges extend from each transaction to two others that it validates.
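The mechanics described here can be sketched in a few dozen lines. This is a toy model only: it captures tip selection, the directed edges from each new transaction to the two it validates, and a cumulative Trust Score confirmation rule, but it omits signatures, balances and networking. The Trust Score values and the confirmation threshold are invented for illustration.

```python
import itertools

class DagLedger:
    """Toy DAG ledger: each new transaction validates two earlier ones,
    and a transaction is confirmed once the cumulative Trust Score of
    everything validating it passes a threshold (all values illustrative).
    """

    def __init__(self, genesis=("g0", "g1"), genesis_trust=10):
        # validated_by maps tx -> transactions that reference (validate) it
        self.validated_by = {g: [] for g in genesis}
        self.trust = {g: genesis_trust for g in genesis}
        self._ids = itertools.count()

    def tips(self):
        """Transactions with no inbound validations yet ('sources')."""
        return [tx for tx, refs in self.validated_by.items() if not refs]

    def add_transaction(self, trust_score):
        tips = self.tips()
        # Fall back to older transactions if fewer than two tips exist.
        parents = (tips + [t for t in self.validated_by if t not in tips])[:2]
        tx = f"tx{next(self._ids)}"
        for p in parents:
            self.validated_by[p].append(tx)  # directed edge: tx -> p
        self.validated_by[tx] = []
        self.trust[tx] = trust_score
        return tx

    def cumulative_trust(self, tx):
        """Trust of tx plus everything that transitively validates it."""
        seen, stack, total = set(), [tx], 0
        while stack:
            cur = stack.pop()
            if cur in seen:
                continue
            seen.add(cur)
            total += self.trust[cur]
            stack.extend(self.validated_by[cur])
        return total

    def is_confirmed(self, tx, threshold=50):
        return self.cumulative_trust(tx) >= threshold

ledger = DagLedger()
a = ledger.add_transaction(trust_score=20)  # validates tips g0 and g1
b = ledger.add_transaction(trust_score=25)  # validates the new tip plus an older tx
print(ledger.cumulative_trust("g0"))  # 55
print(ledger.is_confirmed("g0"))      # True
```

Because every new edge points from the newest transaction back to older ones, cycles cannot arise, and each new arrival adds validation weight to everything beneath it, which is why heavier use speeds up confirmation rather than slowing it down.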
To reach transaction consensus, we need to find the heaviest cumulative chain. If the cumulative chain has surpassed a predetermined threshold, then the transaction will have reached a ‘trusted’ consensus and can be confirmed. Those with higher Trust Scores are incentivised with optimised transaction confirmations, as their chain can reach the required cumulative Trust Score threshold faster. The implicit nature of the DAG structure and validation process enables the protocol to reach faster consensus — achieving a transaction confirmation rate of 10,000 TPS.

Could DAG blockchains become Blockchain 3.0?
The short answer here is, yes. The potential is there. However, it will depend on real world use cases and the ability to implement practical scalability. DAG-based projects, such as COTI, IOTA and Hashgraph, are decidedly gaining speed, as they are tailor-made for projects with a high demand for scalability, such as the online payments industry and the Internet of Things (IoT). While greater scale in blockchain-based networks leads to adverse effects on network usability, in DAG-based networks greater network usage leads to improved network scalability. As such, there is a positive correlation between the number of network users and the rate at which transactions are confirmed. This makes the DAG ideally suited to achieve full decentralisation without compromising on scalability, instantaneity and low-to-zero fees.
It is important to note that DAG projects are far from being perfect. The use of smart contracts in conjunction with DAGs has yet to take off, although it would be positioned to solve scalability challenges. Moreover, one of the key challenges with the use of DAGs is their vulnerability to double spending attacks, in which a malicious party can attempt to spend the same amount of money twice across different locations in the network. This, of course, is a massive challenge as real money is at stake here. However, there are several projects working relentlessly to solve this issue without the cost of harming scalability and so we will certainly be hearing about DAG blockchain a lot more in the near future.
Eli Hallufgil, Research & Software Engineering lead for COTI
Image Credit: Zapp2Photo / Shutterstock
Whether you’re talking about football, cricket or rugby, the draw of team sports is unmistakable. Around the world, fans tune in by the millions to root for their favourite teams, watch stunning feats of athleticism and experience the thrill of victory – or the agony of defeat. In each of these sports, a match is a series of challenges and obstacles. Both offense and defence require a solid plan and the ability to quickly adapt when that plan fails. Beating the other team requires quick thinking, agility, stamina, strength and (quite often) sheer luck.
In today's relentless cybersecurity landscape, IT departments face a challenge that’s not unlike going up against a championship football team. The opponent is crafty, experienced and hungry for victory. However, this is no game and the consequences of defeat are very real. Fortunately, IT departments can prepare for these challenges in much the same way that sports teams prepare for a game: by developing detailed plans based on understanding the opponent – and being ready to adapt when those plans fail.
Outlined below are some of the most common hurdles that cybersecurity teams face, alongside tips for overcoming them. Once you’ve developed your game plan for each, practice each scenario. Simulate each scenario multiple times, taking notes about what worked well and what didn’t. Adjust your plan accordingly. Armed with this preparation, you will be much nimbler when unexpected attacks present themselves.

Ransomware
Ransomware is a constantly evolving malware shape-shifter that poses a threat to all companies’ security. One of the largest ransomware attacks in recent years was NotPetya. This ransomware strain targeted Windows-based systems, causing global panic amongst businesses. It worked by infecting the master boot record to execute a payload that encrypts a hard drive's file system table and prevents Windows from booting. It subsequently demanded that the user make a payment in Bitcoin in order to regain access to the system.
In the event of a ransomware attack, companies should not pay the ransom, as there is no guarantee that they will actually get their data back. Instead, they should plan ahead by implementing a top-tier recovery solution that allows end users to quickly and easily restore their own data. This not only greatly reduces the risk of data loss during a breach, but also goes a long way towards maintaining the financial security of a company. This approach is also extremely useful when unexpected complications arise: no matter what clever tweaks the cybercriminals develop, users can simply revert their devices to a point before infection.

Phishing / Whaling
While some enemies attack through brute force, others opt for deception. Phishing’s purpose is to manipulate employees into clicking malicious links that enable viruses to spread and ultimately cripple a business. Often the emails claim urgency and require immediate action. In doing so, they apply pressure on recipients to act rashly without identifying the scam.
A specific type of phishing scam focuses on prime targets, such as C-suite executives of big companies. This practice is more commonly known as “whaling.” Those in high-ranking positions are often more lucrative victims due to their authority within the organization. An infamous victim of whaling was Snapchat, where a senior employee was fooled by a spoof email into revealing sensitive financial information.
Many phishing attempts are poorly constructed and relatively transparent, but it only takes a momentary lapse in concentration for one to prove successful. Therefore, employees should be trained to distinguish legitimate emails from phishing baits, and to report suspicious links to the IT department. Like running a practice drill, setting up fake attacks can train employees to spot suspicious emails and help instill in individuals a sense of responsibility about their email behaviour.
It can be most challenging to detect those threats that come from inside. Insider threats can be committed by anyone within the company – either through malice or simply through human error. Either way, the collateral damage could be irreparable for a business. For many Chief Information Security Officers (CISOs), this means dedicating energy to internal vigilance, as well as keeping the bad guys out.
To protect a business from an insider threat, IT departments need to have full visibility of the sensitive corporate information that exists within a company, where it resides and how it moves. Once IT has a comprehensive view of the flow of data across the business, they can detect suspicious movements and exchanges of information. Lastly, implementing the correct data protection and visibility tools ensures an unrivalled defence system, both in alerting IT teams to a breach and showing them how to quickly rectify it.

GDPR
The biggest change in data protection legislation in the past two decades went into effect on May 25, 2018. The General Data Protection Regulation (GDPR) is raising the stakes for how businesses process and handle private information in the EU. Unfortunately, this isn’t just a challenge that will affect businesses located in Europe – the GDPR applies to every company that does business within the EU. Should an organization fail to sufficiently safeguard personal data against a breach or flag a breach to the supervisory authority within 72 hours, they will face a fine equivalent to €20 million or four percent of its global annual turnover, whichever is greater.
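The fine calculation itself is simple: the upper tier is €20 million or four per cent of global annual turnover, whichever is greater. A quick sketch, with illustrative turnover figures:

```python
def max_gdpr_fine(global_annual_turnover_eur):
    """Upper-tier GDPR fine: EUR 20m or 4% of global annual
    turnover, whichever is greater."""
    return max(20_000_000, 0.04 * global_annual_turnover_eur)

# For a EUR 100m-turnover firm the flat EUR 20m floor applies;
# above EUR 500m turnover, the 4% figure takes over.
print(max_gdpr_fine(100_000_000))    # 20000000
print(max_gdpr_fine(1_000_000_000))  # 40000000.0
```

The takeaway: for any large enterprise the percentage term dominates, so the exposure scales with the business rather than being capped.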
So, how can companies demonstrate compliance? Companies must implement security measures that match the level of risk to the personal data held. IT departments and systems must be safeguarded with adequate time and resources to ensure that data transfer and transmission can function without risk to security. This ranges from securing the enterprise perimeter with antivirus and malware protection, to swiftly detecting breaches and ensuring endpoint visibility. Why is the endpoint so important? Code42’s CTRL-Z study showed that more than 60 percent of corporate data is stored on user endpoints, so it’s vital not to leave them undefended. There is no “one size fits all” solution for adhering to the GDPR, but regular auditing of security solutions and InfoSec strategy to ensure an organization is constantly compliant is a must. After all, GDPR compliance is not a box that can be checked off and forgotten – it is a constant state that must be continually maintained, like an athlete staying in shape throughout the season.
The cybersecurity landscape continues to evolve in complexity and difficulty, with new obstacles cropping up nearly every day. Security professionals do not have to accept the inevitability of having their organizational security compromised or attacked, but they should plan for the worst case scenario and be ready for anything.
Richard Agnew, VP EMEA North at Code42
Image Credit: Bee Bright / Shutterstock
When it comes to cyber security, employees are often too laid back. It could be that they are unaware of the consequences of a data breach or a phishing scam, or perhaps they just don’t think they are responsible for preventing them.
But this shouldn’t be the case, and it’s a lack of education that makes employees one of the most vulnerable points in a business, with recent research revealing that 28 per cent of workers re-use the same password across multiple accounts and 22 per cent have already been hacked.
So why are workers opting for such an unsafe route? And what can businesses be doing to encourage employees to be cyber-safe at work?

Education
On the whole, employees are somewhat unaware of the risks that follow everyday tasks. We all read about the large-scale data breaches happening across the world – most recently the Yahoo UK data breach fine - but very few stop to understand how it happened or who is responsible.
To combat this, business owners or the IT department – depending on how big the organisation is – need to educate employees on the risks and the policy the company has in place to avoid them.
Training should be undertaken as a whole team to ensure everybody is on the same page and reiterate the policies and processes. Once complete, it is ultimately the employee’s responsibility to be more aware of cyber security issues and put the training into practice.
Looking at the most common and significant threats out there today, there are a few stand-outs that businesses and business owners need to be aware of to ensure they are protected.

Unsecure devices
It is far easier for a cybercriminal to use an employee against a company than it is to devise unique ways to break into the IT systems.
The threat could come from anything, from connecting to a personal device in work, to taking company-connected devices home. Other scenarios would include connecting to work accounts through an unsecure network or device and accessing sensitive data or copying files to an unauthorized USB destination.
To avoid these cyber threats, the company should be empowering the team through the training sessions outlined above to remind them just how vital they are to the organisation. As a platinum status WatchGuard partner – one of only two in the UK – SysGroup not only offers firewalls, email security and other security solutions but also interactive security awareness training courses tailored to specific contexts and requirements. The aim of the training is to make employees more aware of data and security, while creating opportunities to make improvements.

Sophisticated phishing
What may seem like one of the oldest tricks in the cyber-criminal handbook, sophisticated phishing is on the rise. These scams are becoming so common and advanced that it’s often difficult to discern between genuine and fake emails. In fact, in 2017, 91 per cent of attacks began with email phishing.
Phishing emails or messages prey on our instinctual responses to panic; they threaten loss of data or account details, hoping that you will click through immediately and give up your details.
To protect against these threats, employees should be encouraged to think twice and delete emails from suspicious or unknown addresses. That said, research tells us that even with training, 23 per cent of phishing emails are still opened, with 12 per cent of those targeted clicking on the malicious link, so protecting the company against human error should be a top priority.
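As an illustration of the kind of signal both people and filters are trained to look for, here is a deliberately crude scoring heuristic. The trusted domains and urgency phrases below are invented examples; real email security products use far richer signals than this sketch.

```python
URGENT_PHRASES = ("act now", "immediately", "account suspended", "verify your")

def phishing_score(sender, subject, body, trusted_domains=("example.com",)):
    """Crude heuristic: +1 for an unfamiliar sender domain, +1 per
    urgency phrase. trusted_domains is an illustrative allow-list."""
    score = 0
    domain = sender.rsplit("@", 1)[-1].lower()
    if domain not in trusted_domains:
        score += 1  # unfamiliar sender domain
    text = f"{subject} {body}".lower()
    score += sum(phrase in text for phrase in URGENT_PHRASES)  # urgency pressure
    return score

print(phishing_score("it@example.com", "Lunch menu", "See attached."))  # 0
print(phishing_score("support@examp1e.net", "Act now!",
                     "Your account suspended. Verify your details."))   # 4
```

Note how the second message trips several signals at once: a near-miss domain plus stacked urgency language, the exact pattern described above.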
At SysGroup we provide customers with email security solutions and technology from leading providers such as Mimecast and Kaspersky, to defend and protect against advanced threats and data loss. Nowadays, it’s more critical than ever that organisations of any size implement robust email security steps to avoid future mishaps.

Passwords, passwords, passwords
Finally, relying on a simple username-password combination to access systems puts the whole organisation at risk of a data breach.
The software used to crack password combinations is so advanced nowadays that a simple single-word password or predictable pattern will be easily guessed in no time at all. Malicious password cracking software for example, can guess billions of options in seconds, so pet names or the place you were born are no match for these algorithms.
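The arithmetic behind that claim is easy to check. Assuming an illustrative rate of one billion guesses per second, an eight-character lowercase password falls in minutes, while twelve mixed-case-plus-digit characters would take on the order of a hundred thousand years to exhaust:

```python
def brute_force_seconds(charset_size, length, guesses_per_second=1e9):
    """Worst-case time to exhaust every password of the given length.
    One billion guesses per second is an illustrative cracking rate."""
    return charset_size ** length / guesses_per_second

# 8 lowercase letters: 26^8 combinations.
print(brute_force_seconds(26, 8))  # about 209 seconds

# 12 characters drawn from upper + lower + digits: 62^12 combinations.
years = brute_force_seconds(62, 12) / (3600 * 24 * 365)
print(years)  # roughly 100,000 years
```

Length and character variety multiply the search space exponentially, which is why long passphrases beat short "complex" passwords.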
Even a mandatory password reset can be insufficient. Business leaders must work with the IT department to help employees pick stronger passwords and use multi-factor authentication as the new standard. Also consider investment into password management software – it can make a big difference to the organisation and can be an important step to lessen the chances of being hacked.
At SysGroup, we work with one of the world’s leading technologies to ensure our customers’ emails stay safe. The technology combines two of three possible authentication factors, so that even a data leak that compromises usernames or passwords leaves accounts secure, since attackers still lack the additional authentication factor.

Planning ahead
What’s clear is that businesses and business owners should be prepared for the worst and look at all sources to combat the threats out there today. In a world where cybercriminals are becoming smarter and faster, a company can never be too secure.
Internal cyber security policies should be constantly re-evaluated, and working with expert partners can help to create a robust strategic plan from the top down, enlisting the help of employees in better security practices.
What’s more, with GDPR legislation recently coming into effect, it is now even more important to prevent the human errors that lead to so many data breaches. Training days and even simulated phishing scams and other attacks will reveal how vulnerable the business may be and help to craft the vital strategy to improve systems and procedures.
Adam Binks, CEO of SysGroup
Image Credit: Andrea Danti / Shutterstock
The World Cup is well under way. It's one of the biggest sporting events in the world. Bigger than the Super Bowl! Bigger than the World Series! It has drawn more than 5 million in-person spectators to Russia and is already attracting a worldwide TV and online audience in the billions.
Producing and running such an event requires hundreds of companies and thousands of people. Like the Olympics, it happens in a new location every four years, which means the infrastructure must be built from scratch each time, and the complexity becomes more staggering with each fresh incarnation.
This year, the World Cup is possibly more primed than ever before for controversy. The tournament was awarded by FIFA to the Russian Federation amidst a flurry of bad blood, exacerbated by claims during the subsequent FIFA investigation that the Russian computers involved in the bid were “destroyed.”
Since the decision to award the tournament to Russia, media coverage of the country has been far from flattering. Widespread condemnation of Russian activity in Crimea and Ukraine, along with accusations of meddling in democratic processes across the globe, has left a less than positive outlook on the superpower. The opportunity to host the greatest show on earth is hotly anticipated as a way for Russia to remould its global brand: a chance for some positive publicity, and a bit of redemption perhaps. The flipside of such an opportunity presents a plethora of potential issues, not least in an area Russia knows all too well – cyberspace.
When you consider the scope and scale of technology infrastructure required to host and broadcast one of the largest sporting events in the world, there is plenty of opportunity for malicious cyber activity. Given the level of investment and number of physical attendees and remote viewers, the World Cup presents not only a ripe target but a grand stage for a wide range of cyberattacks. Anything is possible, ranging from ticketing scams to malware and ransomware outbreaks to distributed denial of service (DDoS) attacks.
The good news is that constructing the digital infrastructure from scratch (to an extent) should provide the advantage of using new technology that attackers haven’t had the opportunity to research and compromise in advance. As most hacks are multi-phased, what has already begun to happen is the registration of lookalike domains. For example, while worldcup2018.com is taken for almost all domain extensions, malicious groups are already registering domains such as russia2018 and russiaworldcup.com, likely for use in phishing emails to fans and even to staff, as a campaign to socially engineer malware and backdoor access and gain control of the digital infrastructure from the inside.
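Defenders can flag such lookalike registrations automatically by measuring how close a newly registered domain is to the official ones. The sketch below uses a simple string-similarity ratio; the domain list and threshold are illustrative, and production systems add many more checks (homoglyph tables, WHOIS age, certificate data).

```python
from difflib import SequenceMatcher

OFFICIAL = ("worldcup2018.com", "fifa.com")  # illustrative allow-list

def lookalike_ratio(domain):
    """Highest similarity between a domain and any official one (0..1)."""
    return max(SequenceMatcher(None, domain, official).ratio()
               for official in OFFICIAL)

def is_suspicious(domain, threshold=0.75):
    # Similar-but-not-identical domains are the classic phishing pattern.
    return domain not in OFFICIAL and lookalike_ratio(domain) >= threshold

print(is_suspicious("worldcup2018.com"))  # False (the official domain itself)
print(is_suspicious("wor1dcup2018.com"))  # True  (digit '1' swapped for 'l')
print(is_suspicious("unrelated.org"))     # False (not similar at all)
```

Running this kind of comparison over new domain registration feeds is one practical way to build the blocklists mentioned below before phishing campaigns launch.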
As we saw during the Champions League final, it's also possible for a targeted malware campaign to exploit known vulnerabilities in thousands of IoT devices, working to generate a botnet army at the ready, to strike with a DDoS attack. If there is good news here, it’s that there is a very slim chance that a breach would actually bring the games to a halt.
Russia however, will know all of this is a distinct possibility. World Cup organisers have most likely brought in skilled experts early on in the planning stages to create both physical and digital threat models to understand and pre-empt the most likely attack scenarios. It’s important to start engaging with security intelligence experts to identify cyber security threats early and start building defences such as black lists of any potential attack vectors, as well as educating both ground staff and developers working on infrastructure about the security risks and how to best mitigate them.
As for attacks on the individuals involved, players’ social media accounts may be an easy target. It has happened in the past. Meanwhile, players are focused on the tournament itself, and the security of their social media accounts is often not even on their radar during that period. Ensuring that passwords are managed properly and that best practices for securing personal information are followed is difficult to manage at the best of times.
For fans both at the event and at home, there is potential for disruption. Any aspect of the digital infrastructure is fair game. Self-printing ticket kiosks, connected QR code readers for eTickets, malicious mobile applications posing as official World Cup apps, and the live streaming of the event are all possible areas of threat. Cyberattacks could even impact fans viewing the games from the comfort of their homes, even on the opposite side of the world. Fans will inevitably search for streaming sources outside of the official broadcast partners, and that provides a big advantage to cybercriminals who want to spread malware or steal information.
Viewers should be wary of sites that ask them to install software to view a live stream. This is an outdated requirement for modern browsers and is likely an attempt to scrape details from your web browsing activity (including login credentials for social media), download malicious software, or even hijack your computing resources to mine cryptocurrencies while letting you watch the game illegally.
The advice to fans, organisers and anyone else involved in putting on this gargantuan party is simple: treat your digital information and property as you would physical property. You lock your door and set the alarm before you leave the house, right? You probably don't make a habit of leaving your wallet on display. You may even have a fire extinguisher and a smoke alarm if you're particularly mindful. These precautions are all normal to us, yet applying the same level of proactive security to cyber threats is a long way from common knowledge. Protecting yourself starts with understanding the threats better: use basics like password managers and email scanning software that alerts you to phishing, and be very wary of any software on your phone that asks for permissions you aren't familiar with. Finally, keep your devices and software up to date. Attackers prey on low-hanging fruit: known vulnerabilities in widely used software and software components. If big corporations like Equifax can fall victim to using outdated technology, so can individuals, and so can the world's biggest sporting event.
Steve Giguere, Lead Sales Engineer at Synopsys
Image Credit: Stux / Pixabay
The Institute of Coding (IoC) has officially opened in the UK. Its goal is to nurture the UK's next generation of digital talent 'at degree level and above'.
The IoC brings together the technology industry on one side and academia on the other; currently, 25 academic institutions and 60 businesses are involved. The IoC will look to make courses available at apprenticeship, degree and short-course levels.
The total amount of investment in the project is £40 million, with £20m coming from the government, and the rest from partners.
One of the body's assignments will be to anticipate future skill gaps in the UK workforce. This will be done through research, analysis and intelligence, spearheaded by Dr Rachid Hourizi.
Jacqueline de Rojas, president of Tech UK said: “With UK business leaders crying out for employees with the latest IT skills, the IoC is already helping companies to develop the technical capabilities of workers across the country.”
"Serving as a bridge between industry and academia, this organisation will enable companies to build workforces fit for the future, by offering opportunity to benefit from high quality learning to everyone.”
The UK continues to be the leading European destination for international tech investors, with more than £5 billion in VC funding since 2016. Germany (£2.15bn), France (£1.55bn) and Sweden (£644m) didn't attract that much combined.
The biggest reason for this difference is that London has a high concentration of world-class tech companies. The UK's capital alone has attracted more than £4 billion in investment, ahead of Paris (£1.14bn), Berlin (£814m) and Stockholm (£542m).
Among the biggest deals, the report singles out TransferWise (£211m), Monzo (£71m) and Revolut (£177m).
Image source: Shutterstock/everything possible
Panasonic has just announced a five-inch Android smartphone built for mobile workers. It's rugged, durable, and comes with a set of unique features that the average consumer might find useless, but which will mean a lot to workers in the field.
The smartphone, the FZ-T1, comes with an auto-range barcode reader that allows users to manage inventory more easily and quickly. Repairs and maintenance, navigation, proof-of-service, document capture and real-time inventory checking are just some of the use cases for field workers.
The device is dust and water resistant and can survive a fall from a height of 1.5m. It operates at temperatures ranging from -10 to +50 degrees, weighs just 240g, and has a swappable battery that lasts up to 12 hours.
There are two models available, one with both 4G and Wi-Fi connectivity, and one with just Wi-Fi.
The device is compliant with Panasonic COMPASS (Complete Android Security and Services) 2.0. The company claims this offers businesses everything they need to securely deploy and manage their Panasonic rugged device.
“This latest Panasonic Toughbook handheld is an important addition to our growing rugged Android portfolio,” said Jan Kaempfer, General Manager of Marketing for Panasonic Computer Product Solutions. “It is one of the most versatile, stylish and affordable Android rugged handhelds for business and is underpinned by our market leading reputation for durable design and flexible functionality.”
Image Credit: Nito / Shutterstock
If senior executives in the UK are to be believed, the young workforce is the main culprit for cybersecurity incidents.
A new report by Centrify, which polled 1,000 workers aged 18-24 as well as 500 decision makers in the UK, says more than a third of execs are pointing their fingers at the young.
Younger workers, on the other hand, are doing little to dispel this negative image. More than a third can access all files on a network (instead of only those necessary for work), a fifth need to request access, and 43 per cent say they have access only to the files they need.
The biggest problem is password sharing: more than half (56 per cent) of senior execs worry about this malpractice. Among younger workers, more than a quarter (29 per cent) change passwords at their own discretion, and 15 per cent have shared them with colleagues.
Besides password sharing, senior execs also worry about what young workers might post on their social media profiles and whether such posts could damage the company's image. A fifth of young workers don't care how their posting affects the company, and 18 per cent say they have posted things that could compromise their employers' security and privacy policies.
“Some may think of younger workers as always online, always ready to share information and perhaps not being as concerned about privacy or security as older workers, but we must remember they are the business leaders of tomorrow and we must help not hinder them,” comments Barry Scott, CTO EMEA, Centrify.
“While it’s clear that employers are concerned about this new generation entering the workforce – and see them as a potential risk to both the business and brand – these same companies are perhaps guilty of not putting in place the right security processes, policies and technologies. If you give employees access to any information at any time from any place, or fail to enforce strict password and security policies, they are likely to take full advantage, putting both their own jobs at risk as well as the company itself."
Image source: Shutterstock/deepadesigns
Unequivocally, the best way to ensure successful adoption of the modern workplace is to drive home its intrinsic value to your customer. Much depends on where they are in their cloud roadmap: a customer may not feel comfortable or ready to migrate all their business data into the cloud at once. As one of the UK's top cloud distributors, we repeatedly hear from our customers that, when speaking to SMEs, a hybrid model is one of the most popular options. In our experience, a migration that is planned and executed appropriately can resolve entrenched business issues and lay the groundwork for a successful workplace transformation.
Here are four steps to improve your chances of success:
1. Identify the target platform best suited for the organisation
Numerous factors weigh into the decision to select either Office 365, G Suite or another productivity suite entirely. Among them are the desire to continue using productivity applications on the desktop, ensuring files are stored securely in the cloud, and that access to email is easy, flexible and able to be scaled.
Office 365's dominance in the cloud productivity space - now more than 120 million business users worldwide - is driven largely by the ubiquity of Microsoft Office applications and people's familiarity with using Excel, Word and PowerPoint in business. Despite the market disruption caused by modern file transferring tools such as Dropbox, WeTransfer, OneDrive and Acronis Files Cloud, many people still prefer to send each other files over email, created locally with Microsoft apps, because that's what feels familiar and safe. Choosing Office 365 allows teams to hit the ground running by working collaboratively online with easy editing options, and it also offers more advanced virus protection and rights management functionality.
Small businesses that elect G Suite favour its born-in-the-cloud, basic personal and team productivity tools. It's particularly favoured by small start-ups that have no need to transition their business into the cloud, since they've been part of a cloud network from day one. If their business grows at an extraordinary pace, however, they may feel the need to transition to Office 365, which offers more generous file and email storage than the entry-level G Suite plan.
2. Chart your migration path carefully
It's widely understood that when migrating to the cloud, the more intricate the cloud architecture is, the less likely the migration will be successful. To minimise complexity and the risk of compromising data during a migration, consider what your customer wants, how much time and resources you'll need to dedicate to the project, and where and when you'll implement the migration to mitigate customer impact.
Let's circle back to the customer's roadmap. Migrating to Office 365 can be relatively painless, depending on what email solution they've used previously. There are a number of options you could take, including but not limited to:
Use a cutover migration
- Great for organisations with fewer than 2,000 mailboxes
- Used to move mailboxes from on-premises Exchange Servers to Office 365
- Maintains the mailboxes during migration without any additional steps
- No more than 150 mailboxes should be migrated at a time
Use an IMAP migration
- Used when migrating from a non-Exchange system such as Webmail or G Suite
- Manually create the Exchange mailboxes in Office 365 by using a CSV file containing the email address, username and password for each individual user
- You can only migrate items in a user's inbox - no calendars, contacts or tasks
- You can only migrate 500,000 items max from a user's mailbox (newest to oldest)
Use a third-party email migration tool
- Many of our customers utilise the Office 365 Marketplace to find migration tools that will suit their customer
- Popular choices include CloudMigrator and SkyKick
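For reference, the CSV file used to create mailboxes manually for an IMAP migration is a simple three-column list. The column headers below follow the format Microsoft documents for IMAP migrations; the accounts themselves are hypothetical:

```csv
EmailAddress,UserName,Password
lisa.day@contoso.com,lisa.day,P@ssw0rd-1
john.poe@contoso.com,john.poe,P@ssw0rd-2
```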
It's important to note that when choosing your customer's migration path, agility and nominal cost-efficiency aren't enough. The best way forward is to select a cloud resource that is intimately familiar with a particular configuration and offers applicable migration support services.
3. Develop a single system of record for teams to share contacts and conversations
For Microsoft customers that have legacy CRMs, we generally recommend migrating to Microsoft Dynamics 365. It combines CRM and ERP into one solution and is a step in the right direction for companies that want to use ERP in a limited capacity or as an integrated business management solution that includes Office 365, Outlook, and Power BI for reporting.
However, Dynamics 365 is not a self-starter CRM kit for businesses. In cases where Dynamics feels too complex or expensive, especially for businesses that conduct their contact relationships via email and keep customer information in spreadsheets, we recommend Nimble CRM. It's a social CRM that works inside Office 365, Outlook or G Suite email, contacts and calendars, social media platforms and mobile applications. Nimble unifies contact data and engagement history from disparate systems into a single system of record, with socially enriched profiles that are easily accessible across team members, channels and platforms.
4. Facilitate end user adoption
Ultimately, the success of a cloud migration isn't in the promised technical functionality, it's how the organisation reacts and adapts to it. ITCs that address end user pain points, in addition to providing help desk support with a fully managed service, stand the best chance of enabling a successful workplace transformation.
Nimble's 80% end user adoption rate attests to the intrinsic value business teams see in fixing deep-rooted contact management issues. Without problem-solving solutions like Nimble, businesses could be stunting their growth. On average, salespeople spend a fifth (19%) of their day researching data and insights, according to Cirrus Insight. That percentage translates into an astounding 49 days per year, or £4,029 per salesperson, per year (based on the average salary). This time is better spent in other ways: customer engagement, cross-selling, upselling and so on.
At the very core, using cloud solutions is all about making things easier for the end user. When they can reach everything in one place, with one set of login credentials and integrate their productivity suite with other applications like Nimble, it gives businesses the room they need to expand, grow and scale at large.
Mike Wardell, CEO of Giacom
Image Credit: Microsoft
Barclaycard has announced its first co-branded business trade card in the UK.
The card, built specifically for the construction industry, offers an extended interest-free period of up to 116 days for payments made at Travis Perkins and Toolstation.
To qualify, transactions must be made on the co-branded card, and the extended period applies when the customer pays their statement balance, excluding the extended interest-free balance, in full and on time. The standard interest-free period of up to 56 days applies to any purchase when the customer pays their statement balance in full and on time.
Barclaycard also says a fifth of small businesses see cash flow as one of the biggest everyday obstacles to business growth. It bases this conclusion on an online survey of 1,001 UK SME decision-makers, conducted in the autumn of last year.
Working capital is a bigger concern than overhead costs and worries around customer contracts.
“We’re excited to launch this partnership with Travis Perkins and Toolstation, two of the biggest names in the industry. While cash flow remains one of the main challenges for any growing business, we know that the construction industry in particular is comprised of many sole traders and small teams that frequently make up-front investments before they themselves get paid," said Ian Reid, director of small business at Barclaycard.
“We hope our extended interest-free offer will be another way to help them take control of their cash flow.”
Cisco Live is a yearly gathering of Cisco customers, partners, and employees, eager to experience the latest Cisco technologies. Cisco engineers from throughout the company converge to create a pop-up city that enables content sharing, interactive labs, exciting demos and live streaming.
The show is powered by a dynamic network, the same network that enables businesses to run and thrive in this digital world. The team responsible for managing the network, which has grown every year along with the event's size and complexity, ranges from early-in-career associate systems engineers to experienced CCIEs (Cisco Certified Internetwork Experts), technical leaders and distinguished engineers. Many of these people return every year, bringing a richer set of experiences and feedback for improvements. Let's explore the year-over-year evolution of the network engineering team, examining how the type of work has changed at each level.
Humble beginnings
When the team first began building out the network for Cisco Live in Europe, the Network Operations Centre (NOC) was a relatively small team that operated in a heavily siloed manner - not because they hadn't worked together before, but because the different tasks involved in the build-out didn't overlap. The group consisted of routing, switching, wireless, access, data centre and network management teams. Each team knew its part of the network and did what it thought best to accomplish its goals. There was no centralised automation, which meant a lot of configuration by hand, device by device. Switches were unboxed, powered on, and configured via copy and paste. Complex Quality of Service (QoS) configuration on edge routers was likewise accomplished by hand or via copy and paste.
This manual configuration approach brought challenges that were compounded by the fact that the Cisco Live network was very dynamic, often needing last-minute changes either driven by stakeholder needs or venue oddities. When there was a problem, it often meant breaking out a console cable and running across the venue to troubleshoot. Network visibility was limited to the tools that were on hand. Because of the lack of strong collaboration, sometimes work was duplicated. The team was very reactive and, because there wasn't pervasive network visibility or strong holistic automation, this led to long hours.
In 2015, we needed to redesign the data centre for Cisco Live Europe, a project I led. If I had to add a new VLAN, it meant manually configuring four switches, four UCS fabrics, and 16 compute hosts. Carefully applying the necessary commands on all of the devices took a good 20 minutes – while the network was running. I didn't want to ever do that again. Since I had been spreading the message of automation at Cisco Live in various breakout sessions, it was time to practice what I preached: I began to build scripts that would automate the whole process.
Introducing automation
With this goal in mind, I developed a set of Python and shell scripts that would automate the process of configuration. Parameters required to create an end-to-end VLAN were pushed to each device and were automatically validated. These scripts didn’t simply automate the command-line interface, they also enabled the data model-driven APIs embedded in the devices. This meant the configuration was applied more quickly and more reliably. At the 2016 Cisco Live, none of the configuration had to be applied by hand. Instead of taking 20 minutes to configure one VLAN, four new VLANs were created in two minutes.
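As an illustration of the approach rather than the actual Cisco Live scripts, a VLAN-creation helper along these lines validates the parameters once and then pushes the same payload to every device through an HTTP-based device API. The endpoint URL, payload shape and function names here are hypothetical:

```python
import json

def build_vlan_payload(vlan_id: int, name: str) -> dict:
    """Validate parameters and build the JSON body for a VLAN create call."""
    if not 1 <= vlan_id <= 4094:
        raise ValueError(f"VLAN ID {vlan_id} is outside the valid range 1-4094")
    if not name or len(name) > 32:
        raise ValueError("VLAN name must be 1-32 characters")
    # Payload shape is illustrative rather than a specific YANG model.
    return {"vlan": {"id": vlan_id, "name": name}}

def push_vlan(session, host: str, payload: dict) -> None:
    """Push one validated payload to one device; loop over an inventory to fan out."""
    url = f"https://{host}/restconf/data/vlans"  # hypothetical endpoint
    resp = session.post(url, data=json.dumps(payload),
                        headers={"Content-Type": "application/yang-data+json"})
    resp.raise_for_status()

# Fan the same definition out to switches, fabrics and hosts in one pass:
# for host in inventory:
#     push_vlan(session, host, build_vlan_payload(142, "CL-DC"))
```

Because the payload is built and validated once, every device receives an identical, checked configuration instead of twenty minutes of hand-typed commands.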
This successful use of automation was addictive. The next year, since we were using a similar data centre architecture, I spent more time building scripts to monitor the health of the data centre, the services running in it and the network itself. I built Python scripts that gathered data using SNMP, device APIs and application APIs; processed the data; and then pushed informative messages to various Spark rooms monitored by NOC staff. This allowed all of us to proactively mitigate potential issues, such as routing table changes, DHCP pool exhaustion, interfaces going into an error-disabled state and devices becoming unreachable. We also created a bot that could tell you where a given user, MAC address or IP address was in the venue.
Growth and change
Over the years, both Cisco Live and the network operations team have grown. Eight years ago, there were about 5,000 attendees and a team of 35 operations engineers. At the 2018 Cisco Live in Barcelona, there were about 15,000 attendees and a team of 70 network operations engineers, many of whom were new to Cisco and participating in the NOC for the first time. Each one of them played an important role in delivering an automation-centric, production-class network; many of them had feedback on how we can do better next year. It's interesting to note that while the size of the event has tripled in eight years, the NOC team has only doubled. The capacity of the network, as well as the number of services, will continue to increase, and because of the increased use of automation, the NOC team can more than keep up.
Network engineering in transition
Network interactions are changing: from manual configuration, to automation and orchestration, and now to intent-focused operation. What will not change, however, is the need for engineers at every skill level. Junior engineers will interact with the network through web-based portals and Application Programming Interface (API) calls, and will use the terminal less. Senior engineers will use web-based tools too, but to simulate new network designs and architectures. They will also use APIs to build custom integrations that tie the network tightly to the core business, creating a true digital differentiator.
The role and contribution of today’s network engineers are more valuable than ever. Tasks are moving from manual to automated, and the workflow is focused on faster, more scalable delivery. Engineers are now capable of and expected to quickly create reliable network interactions and spend more time on higher-value network designs and business integrations.
The network operations centre at Cisco Live provides an example of how the role of the network engineer is transitioning. The tables below offer more detailed insights into the changing roles at each level, for junior engineers and senior engineers alike.
Joe Clarke, distinguished services engineer, Cisco Systems
Image Credit: Flex
‘You are what you eat.’ It is an often-quoted phrase, subtly telling us that if we want to be healthy, we should eat healthily. But it also begs the question: do we truly know what we are eating?
Over the years, there have been a number of food scandals that have rocked consumer confidence in the food industry. In 2013, we had the UK horsemeat scandal. Two decades earlier, we had the mad cow disease epidemic as a result of contaminated British beef. Recently, an investigation by the Daily Telegraph revealed that “Meat free” vegan food sold at Britain's leading supermarkets actually contained traces of meat.
The above cases highlight that consumers are relying solely on the information on packaging and an implicit trust in the food industry when it comes to where their food is sourced. In reality, they really have no idea. Food contamination is a constant threat for consumers, costing UK households £1.17 billion a year. For restaurants and supermarkets, such outbreaks not only drastically affect revenues but severely damage brand trust.
Uncomplicating the food supply chain
Blockchain technology has been used by financial institutions to bring transparency and immutability to currency transactions. It is essentially a decentralised ledger that stores information in a way that prevents that data from being manipulated by anyone. There is no one centralised party overseeing the network of ledgers. Instead, blockchain databases have a set of protocols that every participant in the network must abide by to ensure trust and accuracy.
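To make the immutability claim concrete, here is a minimal, hypothetical sketch of the core mechanism: each entry stores the hash of the previous one, so changing any earlier record invalidates every hash that follows. A production blockchain adds distribution, consensus rules and digital signatures on top of this.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Deterministic SHA-256 fingerprint of a block's full contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, record: dict) -> None:
    """Link a new record to the chain by embedding the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"record": record, "prev_hash": prev})

def chain_is_valid(chain: list) -> bool:
    """Recompute every link; any tampering upstream breaks a later link."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

# Hypothetical provenance trail for one batch of produce.
chain: list = []
append_block(chain, {"batch": "B-1021", "origin": "Farm A"})
append_block(chain, {"batch": "B-1021", "step": "processing plant"})
append_block(chain, {"batch": "B-1021", "step": "supermarket"})
assert chain_is_valid(chain)

chain[0]["record"]["origin"] = "Farm B"  # quietly rewrite the origin...
assert not chain_is_valid(chain)         # ...and the chain no longer verifies
```

The point of the sketch is the last two lines: altering an early record is detectable by anyone who recomputes the hashes, which is what makes the ledger tamper-evident.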
As it stands, the supply chain in the food industry is strained at best and wholly inadequate at worst. There are multiple intermediaries involved in a complex food supply chain, each with their own system for tracking what they do and documenting ingredients, processes and sources. As a result, data sharing within this heterogeneous ecosystem is often limited to those who have direct contact with each other, resulting in incomplete and inaccurate information passing from stage to stage.
Applying a blockchain-based, decentralised system to the food industry's supply chain will benefit all the participants that play a role in the harvesting, movement, processing and consumption of food. Retailers can easily identify the sources of contaminated food and deal with major contamination quickly and effectively. For food producers, it means a complete, end-to-end view of the entire food supply chain, with visibility of the role intermediaries have played in the provenance of produce.
Revealing the backstory behind our food
Blockchain technology can provide consumers with a deeper understanding and appreciation of where their food comes from. Currently, consumers rely on food packaging or their own knowledge in order to choose what food they buy or eat. Even at restaurants and events, consumers choose what food to put on their plate based on what they know or what they read on a menu.
However, if consumers had access to the same supply chain data that food suppliers have, perhaps in a more simplified form, it would better inform them as to where their food has travelled before it ended up on their plate.
Like the intermediaries directly involved in the food supply chain, consumers can trust this data. Database systems, underpinned by blockchain technology, cannot be tampered with or altered. Due to the blockchain’s rigid set of rules and the absence of any centralised third party, all the data captured is immutable and accurate.
Applying blockchain technology in a manner that isn't complicated for consumers is easily achievable. For example, by integrating blockchain technology with QR codes, consumers can use their smartphones to scan a food package or menu at the point of sale and, via an app or browser, see how their food has travelled across the supply chain, from where it was sourced to where it was stored.
Ensuring the traceability of our food
If we truly are what we eat, then blockchain technology could finally enable consumers to understand exactly what they are eating and where it has come from.
Despite the best intentions of brands, restaurants, supermarkets and others who operate within the food sector, the fact is that there are some grey areas when it comes to the traceability of food. Products can proudly label themselves as “British-made”; however, the loose legal requirements around displaying food origin mean this has become little more than a marketing ploy. On closer inspection, these “British-made” food products could actually be sourced in France and merely cut and packaged in the UK, with consumers none the wiser.
The capabilities of blockchain technology ensure that the origins of food cannot be hidden nor altered by food brands. Each item within a blockchain database is given a unique digital code or certificate which cannot be tampered with. By applying this methodology to the movement of food produce across a supply chain, it means brands will have no choice but to be completely transparent with customers.
Unlike the banking industry, which has lost considerable consumer confidence since the financial crash, the food industry has yet to suffer the same level of scrutiny and backlash. However, that isn’t to say that consumers wholeheartedly trust the food industry. In fact, there appears to be a trust deficit, as consumers are becoming increasingly sceptical about food companies and the whole food system in general.
The food scandals over the past two decades have brought to light the fragility of the food supply chain to consumers. Moreover, in recent years, consumers have become more health-conscious and are savvier about what exactly goes into their food. Within this consumer environment, the food sector has no choice but to be more transparent beyond labelling and packaging.
Food brands, restaurants and supermarkets must now be able to illustrate to consumers the journey of their food. Blockchain technology gives the food sector a means to not only correct the inherent deficiencies in the current supply chain system, but also educate consumers about how their food travels from its source of origin and onto their plates. While food contamination would not be permanently eradicated because of a supply chain system running on blockchain technology, it would enable brands to deal with such incidents more effectively and quickly.
Chris Painter, CEO, Omnitude
Image Credit: Zapp2Photo / Shutterstock
Banking and financial services are undoubtedly among the most heavily regulated sectors to work in, and for good reason. Companies in these sectors frequently handle the data of millions of consumers, not to mention businesses and even governments. From the new Second Payment Services Directive (PSD2) and the even newer EU General Data Protection Regulation (GDPR), to the Financial Services and Markets Act 2000 (FSMA) and the Payment Card Industry Data Security Standard (PCI DSS), there are many rules, regulations and directives with which organisations must comply. Some of them even appear to contradict one another.
The key to compliance generally lies in a mixture of procedures, policies and data management, which in almost all cases can be improved and simplified by the application of technology. Below are some guidelines that will help.
Leave data on the mainframe
PSD2, also known as the “open banking” regulation, requires banks to make some of their customer data available to other financial sector organisations. The legislation stipulates that payment account transaction and balance data, credit transfer initiation and account identity verification data must be made available to third parties. Sharing data, however, is difficult to do without compromising its security or integrity.
Transferring data from mainframes, which are used by the majority of banks, to either the cloud or a new digital server is fraught with risk, as TSB recently found out when migrating to a new system. As one IT professional from Lloyds pointed out in the wake of the TSB crisis, “the level of care required to migrate a bank onto different systems is perhaps almost uneconomic to perform in a way that keeps the risk small enough.”
By using APIs instead, modern digital applications can run using existing mainframe databases, while also ensuring that the data can remain in place, where it can be secured at source. APIs are already used to connect mobile and online banking services to a variety of customer databases and can be applied in a similar fashion to enable third parties to access information in accordance with PSD2.
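As a hypothetical sketch of that pattern, the gateway below answers a third party's balance request by reading the existing system of record in place, and only after checking that consent has been granted. The data structures, names and consent model are illustrative, not a real PSD2 interface:

```python
# Stand-in for the existing system of record; in the scenario above this
# data would live on the mainframe and never be copied to a new platform.
LEDGER = {
    "acct-001": {"balance_gbp": 2450.12, "owner": "J. Smith"},
}

# Which data scopes each third-party provider has been granted, per account.
CONSENTS = {
    ("tpp-42", "acct-001"): {"balance"},
}

def account_balance(tpp_id: str, account_id: str) -> dict:
    """Answer a third-party balance request, reading the record at source."""
    if "balance" not in CONSENTS.get((tpp_id, account_id), set()):
        raise PermissionError("no consent recorded for this third party")
    record = LEDGER[account_id]  # read in place; nothing is exported
    return {"accountId": account_id, "balance_gbp": record["balance_gbp"]}
```

The design choice worth noting is that the gateway returns only the consented fields and leaves the underlying record untouched, which is what lets one copy of the data satisfy both PSD2 sharing and GDPR protection obligations.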
Another option is using data virtualisation tools. These tools can enable the analysis of data “virtually,” while leaving the original records undisturbed in the database. Not only this, but you can also extract data from unstructured data sources in “green screen” terminal-based applications, by emulating the terminal data querying, in an automated process that accesses data and then encrypts it as it is transferred to a new system.
This approach also addresses many of the issues thrown up by the EU GDPR, which insists on the privacy and protection of personal information – a requirement that at first glance appears to be in direct contradiction with PSD2. By keeping only one version of each record on a database and avoiding duplication, it is much easier for banks to ensure the security of customer information. It also simplifies the process of deletion if a customer chooses to exercise their GDPR “right to be forgotten”.
Finally, keeping data on the mainframe helps with compliance with the element of the GDPR stipulating that banks must keep detailed records of which third parties they share customer data with and why. Banks must complete full audits of their Open Banking practices, a lengthy and costly process, especially given that most organisations already spend 20 to 30 per cent of their IT budgets on audit reporting and preparation. By keeping the data in place and creating a secure gateway through which third parties can access it, banks can remain compliant with both PSD2 and the EU GDPR.
Automate to stay great
Any organisation handling credit or debit card data has to comply with the PCI DSS, which states that the development of internal and external software applications must be completely secure, based on industry standards and best practices. In the fast-moving financial services sector, new software applications, or significant upgrades to existing ones, are being developed constantly. Financial sector organisations handling card data must therefore incorporate security measures throughout the software development lifecycle and ensure that this is documented fully in order to satisfy auditors. Any changes to software that handles user IDs, passwords and other personal information must be treated with particular care.
Once again, technology is the answer to this compliance headache. Application development can now be automated by application lifecycle management (ALM) systems. These programmes systematically record and document all actions taken by developers, including when and why they were taken. This makes life much easier for auditors, as all records are automated, kept in one secure location and saved in the same format. ALM automation therefore results in faster – and cheaper – audits. Some banks have even found that once auditors learn that ALM automation is in place, they can quickly tick items off their list and avoid further investigation of systems.

There are other benefits, too
Naturally, investing in new technologies has other benefits. The Financial Times recently reported that a new wave of M&A has hit the banking world, chiefly because banks are ‘crying out’ for technology investment. An automated ALM system also makes mergers easier: future application development becomes easier to control, visibility improves, and banks become more agile.
Not only this, but with fintech companies beginning to profit from access to customer data under PSD2, banks must innovate quickly and develop competitive digital products. Spurring exactly this kind of competition is why PSD2 was introduced in the first place.
With ALM automation, developers no longer need to spend time filling out tedious reports of their daily activities; these reports are generated automatically. Removing these administrative distractions results in a more efficient development team – it can cut the time developers spend on non-development, administrative tasks by up to 80 per cent. As ALM systems store all records on a shared digital database, everyone is kept informed of any changes made to a project, reducing the likelihood of errors, bugs and product failures. ALM systems can also enforce the principle of “least privilege”, providing access only when an individual absolutely needs it and revoking it by default after a set period of time.
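The "revoked by default" part of least privilege can be sketched as a time-boxed access grant. This is a hypothetical illustration of the pattern, not any ALM vendor's real interface; the user and resource names are invented.

```python
from datetime import datetime, timedelta, timezone

class AccessGrant:
    """Hypothetical sketch of least privilege as an ALM system might
    enforce it: access is granted on request and expires automatically
    after a fixed time-to-live, rather than lingering indefinitely."""

    def __init__(self, user, resource, ttl_hours=8):
        self.user = user
        self.resource = resource
        self.expires_at = datetime.now(timezone.utc) + timedelta(hours=ttl_hours)

    def is_active(self, now=None):
        # Valid only until expiry; after that the grant is simply dead,
        # with no manual revocation step required.
        return (now or datetime.now(timezone.utc)) < self.expires_at
```

The design choice worth noting is that revocation is the default state: forgetting to act leaves access closed, not open.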
Overall, ALM systems allow a bank to stay competitive by freeing up its developers, while also helping it comply with PSD2, PCI DSS and GDPR – all while keeping auditing and administrative costs to a minimum.

Avoid that compliance headache
In many ways, banks still have a long way to go before they can ensure compliance with this long list of stringent regulations. GDPR and PSD2 are still two very new pieces of legislation and it may take a while before the world of financial services finds its feet with them. However, existing technology is the key that will bridge the gap, not only between banks and the auditors and regulators, but also with customers. As IBM stated last year, trust is the currency of the new digital economy; and if banks make sure they’re putting in place solutions which help them stay compliant, regulators and customers alike can rest easy.
Guy Tweedale is regional VP at Rocket Software
Image source: Shutterstock/MaximP
How have the recent privacy and security violations crowding the daily newsfeeds changed your company’s behaviour? There’s a silver lining to all the doomsday headlines — they should compel stakeholders in your company to pay more attention and provide more buy-in for proactive safeguarding activities against these risks. How are you going to leverage this opportunity? You need a fresh approach, management support, a solid plan, and comprehensive technology to support all the moving parts involved in setting up an integrated security and risk management program.
As an experienced governance, risk management, and compliance (GRC) consultant and former auditor, I’ve assessed and supported many companies through the challenges inherent to building a mature, enterprise-wide information security risk management program that aligns with global standards and boosts competitive advantage. One way many organisations are approaching this is through ISO 27001, an international standard for establishing, operating, maintaining and continually improving an Information Security Management System (ISMS). This standard pushes organisations to move past checking boxes for adherence to controls by promoting a top-down, risk-based approach to developing processes, policies, and controls that specifically address the organisation’s information security risks. Organisations are certified based on adherence to a set of process-level clauses (requirements) and the controls that support them, and auditors certify against these requirements.

Why try to certify?
I’ve seen a growing number of companies working toward ISO 27001 certification (or towards compliance without undergoing the certification process). Implementing this standard is a highly effective way to build an integrated risk management program by establishing an ISMS – the people, processes and IT systems used to apply a risk management program for managing an organisation’s most sensitive and valuable data. Approaching ISMS development in alignment with ISO standards will help your organisation protect its critical data and IT assets, build resilience against threats and incidents, and be prepared for challenges and opportunities as they arise.
Even though it is voluntary, ISO 27001 certification is a valuable undertaking for many reasons. ISO 27001 is highly recognised and respected worldwide, encourages continual improvement and serves as a solid foundation for other IT risk and compliance standards and frameworks. If you can meet the ISO 27001 standard, you are well positioned to comply with most other information security regulations, as well as client information security requirements.
At this point, organisations doing business globally are increasingly encouraged to achieve certification to stay competitive and win new business. As US companies expand operations internationally, they are often required to comply with additional privacy and security regulations and provide additional assurances to partners and customers. In addition to being an important indicator of information security maturity, a certified ISMS operates as a marketing tool and a seal of approval, providing a genuine competitive advantage. For evidence of this trend, do a quick search on ISO 27001 certification; the results are packed with company press releases announcing certification and re-certification.

A high bar to clear
Many companies struggle to achieve certification. The ISO 27001 standard sets a high bar — it is not a one-and-done, checkbox list of requirements. It is a living, breathing program that includes understanding interested-party requirements, securing management commitment, cataloguing risks, assessing their severity, planning how to remediate them, and producing documentation to substantiate the risk management activities. The standard also requires organisations to apply a mindset of continual improvement, where management pushes past program mediocrity and strives to improve the overall health of the ISMS.
Traditionally, ISO 27001-related tasks have been performed manually; documents are stored in network file folders or process owner local drives and tasks are managed through spreadsheets, documents and email. It is nearly impossible for global, digital businesses to keep up using a manual approach, given the complexity of information security programs, the expanding reliance on supply chains and outsourcing, and the criticality of data and IT systems.
The pain points become acute when it is time for auditors to assess a company’s operations. Scrambling to pull together the proper documentation is a time-consuming hunt that distracts staff from core functions and operational improvement work. An inability to efficiently prove compliance, of course, increases the likelihood of failing an audit. This dynamic is disastrous enough for mandatory regulations like HIPAA and SOX. When it comes to voluntary standards like ISO 27001, failed audits, runarounds, and tedious tasks kill stakeholder enthusiasm and make it impossible to gain traction.
How can you bring focus and efficiency to your ISMS efforts, so you can build momentum towards certification? The key is to streamline, centralise, and automate. As a first step, examine how you currently document and manage your ISMS. If this is done through manual, ad hoc processes, then departmental segmentation, duplicated effort, lack of visibility and accountability, and wasted resources are sure to follow.

Integrated systems deliver lasting benefits
This is why a governance, risk management and compliance (GRC) technology platform is so critical to successful ISMS initiatives and efficient compliance programs. These enterprise software suites comprise interoperable tools that all types of organisations deploy to help manage risk, demonstrate regulatory compliance, automate business processes, and prepare for audits.
Streamlined documentation and automated tracking are key features of these tools. When a task (e.g., inventory, assessment, remediation workflow, exceptions approval, policy review, etc.) is performed within the tool, the tool automatically retains the required evidence, allowing GRC teams to gain significant efficiencies. In contrast, if you’re performing or documenting that task in Excel, it’s nearly impossible to show when or by whom that task was completed.
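The contrast with Excel is easiest to see in code. The sketch below is a hypothetical illustration of automatic evidence capture, not any GRC vendor's real API: the task name, policy ID, and in-memory log are all invented stand-ins for the tool's secure evidence repository.

```python
import functools
import os
from datetime import datetime, timezone

AUDIT_LOG = []  # stands in for the platform's secure evidence repository

def audited(task_name):
    """Hypothetical sketch of how a GRC platform captures evidence: any
    task run through the tool records who performed it, what it was,
    and when, as a side effect the performer cannot forget."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            result = fn(*args, **kwargs)
            AUDIT_LOG.append({
                "task": task_name,
                "user": os.environ.get("USER", "unknown"),
                "at": datetime.now(timezone.utc).isoformat(),
            })
            return result
        return inner
    return wrap

@audited("annual policy review")
def review_policy(policy_id):
    # The reviewer just does the work; the evidence trail writes itself.
    return f"{policy_id} reviewed"
```

Because the evidence record is produced by the act of performing the task, the "when and by whom" question that a spreadsheet cannot answer is answered automatically.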
GRC platforms do far more than establish evidence repositories. They support the work of integrating processes, policies, and controls across departments and business units, which is essential to extending comprehensive risk management throughout the value chain. Digitally linking processes to the risks you identify, the policies you create, and the control procedures you administer weaves a tighter web of protection and oversight. I see the “shall” requirement statements — the standards set by ISO 27001 and other security and risk management frameworks — as objectives. The processes, procedures, and controls you put in place and maintain with the help of a GRC platform determine whether you will achieve those objectives, and how expediently you’ll get there.
GRC platforms, when combined with sufficient staff and expertise and supported from the top down, are instrumental in many ways. Whether your organisation is building an ISMS from the ground up, seeking a better method for managing and integrating security and risk activities, or trying to streamline the audit process after certification, manual processes will no longer suffice. Your team can leverage a GRC platform’s capabilities to manage regulatory requirements, policies and procedures, risk assessments, third parties, incidents, asset repositories, vulnerabilities, audits, and business continuity. When deployed across the organisation, GRC technology systems facilitate collaboration, and increase visibility and accountability. A team attuned to the importance of working together to develop a world-class ISMS can reach compliance and certification more expediently with these capabilities at its disposal.
These benefits are valuable to every organisation. Indeed, many companies follow the ISO 27001 standard without attempting certification, but achieving certification is the only way to provide assurance that your information security and risk management processes comply with the standard. The public, legislators, and industry organisations are increasingly aware of and reactive to negative news about corporate data breaches and individual data privacy issues. Organisations that have built a mature ISMS matching the standard of excellence set by the ISO will be well positioned to sustain competitive advantage and protect their assets and reputation in the face of myriad challenges.
Jason Eubanks, CRISC, ISO 27001 Lead Auditor, Principal Consultant, Lockpath
Image Credit: Rawpixel / Pexels
Rackspace’s Kubernetes-as-a-Service (RKaaS) offering is now available, the company announced via a joint Rackspace/HPE press release. The release says businesses can enjoy ‘elastic infrastructure and simplified IT, in a private cloud environment located in their datacentre, a colocation facility or a datacentre managed by Rackspace’.
The company also announced that RKaaS will be based on a ‘pay-per-use’ infrastructure.
This pay-per-use feature took up most of the joint press release, signalling its importance to the two companies. It allows businesses to use just the right amount of resources, without overpaying as sometimes happens when buying fixed packages. “This flexible capacity model allows customers to take full advantage of the instant enterprise-level scalability of Kubernetes,” the two companies said.
“We are setting the pace of innovation in the Private Cloud-as-a-Service market,” said Scott Crenshaw, executive vice president and general manager of Rackspace Private Clouds. “With RKaaS as the foundation, we’ve created a truly differentiated, first of its kind offering with a pay per use economic model. This will allow businesses to more easily transition new workloads into Kubernetes in a private cloud environment to help modernise application development, accelerate time to market and dramatically increase cost savings. This creates greater strategic flexibility within the enterprise and allows them to accelerate their digital transformation.”
Besides RKaaS’ pay-per-use infrastructure, Rackspace also announced the release of its Private Cloud™ offering.
HPE said they were ‘excited to partner with Rackspace’.
VMware could soon be launching its own blockchain service according to the company's recently published content catalogues for its VMworld conferences in Las Vegas in August and Barcelona in November.
The catalogues offer more details on upcoming sessions at this year's VMworld conferences and attendees at the event in the US will be able to attend a session titled “VMware enterprise blockchain – getting started” along with a workshop of the same name.
The company has expressed interest in blockchain for a while now though it wants to move beyond the proof-of-work most often used for blockchains. A VMware Research blog shows that the company is actively exploring enterprise blockchain and a video on the site hints that a real offering could be revealed soon.
The Barcelona catalogue also includes a session called “Introduction to the VMware Cloud Marketplace” that describes the marketplace as a new service that “extends the route to market you enjoy in VMware Solutions Exchange into the cloud.” Solutions Exchange is the company's marketplace that allows software partners to sell their products and this could mean that VMware is going to provide its partners with added help.
Both VMworld conferences offer more than 1,000 sessions combined and attendees will likely get a glimpse into the company's enterprise blockchain service in either August or November.
Image Credit: ITProPortal
Brian Krzanich is no longer the CEO of Intel, media reported on Thursday. As it turns out, he was romantically involved with an employee in the past, which is against the company’s code of conduct.
“Intel was recently informed that Mr. Krzanich had a past consensual relationship with an Intel employee,” the company said in a press release. “An ongoing investigation by internal and external counsel has confirmed a violation of Intel’s non-fraternization policy, which applies to all managers.”
Intel decided to accept the now-former CEO’s resignation to show that “all employees will respect Intel’s values and adhere to the company’s code of conduct.”
If that were truly the case, the relationship wouldn’t have happened in the first place, but…
Following the resignation, Intel removed Krzanich’s biography from the company website and started looking for a replacement. Both current employees and outside candidates will be evaluated, the company said.
In the meantime, the company’s CFO Robert Swan will take over as interim CEO.
“The board believes strongly in Intel’s strategy and we are confident in Bob Swan’s ability to lead the company as we conduct a robust search for our next CEO,” Intel chairman Andy Bryant said. “Bob has been instrumental to the development and execution of Intel’s strategy, and we know the company will continue to smoothly execute. We appreciate Brian’s many contributions to Intel.”
Image Credit: Michael Moore
Following years of economic upheaval, UK CFOs are now optimistic about national business performance in 2018 according to new research from American Express and Institutional Investor.
This year's Global Business and Spending Outlook surveyed 100 senior finance executives in the UK and found that UK CFOs are amongst the most confident in Europe: 91 per cent of respondents anticipate domestic economic expansion this year, with 38 per cent expecting substantial growth.
In this year's study, 74 per cent of UK CFOs reported increased revenues over the last 12 months versus 54 per cent globally. This confidence is also reflected by the fact that the country's CFOs plan to increase spending and investment in their businesses by 8.6 per cent over the course of the year.
While confidence has increased among CFOs in the UK, many remain watchful, with 83 per cent concerned that widely unanticipated economic or political events could negatively affect their businesses. The research also revealed that uncertainty outside the UK is more of a concern than risk from within it, though just over half of respondents (53 per cent) said they would be unlikely to withdraw from high-risk countries in response to either political or economic risk.
To help combat uncertainty, CFOs have begun to take careful steps to manage and mitigate risks. Of those surveyed, 63 per cent said they are focused on moderate and controlled spending as opposed to aggressive investment. CFOs plan to increase the time and resources dedicated to enterprise-level risk management systems (59 per cent) and to managing risks through insurance and hedging tactics (65 per cent).
Jose Carvalho, Executive VP at American Express Global Commercial Services, commented on CFOs’ return to normality after a long period of economic uncertainty, saying:
“While there’s no denying that we have a long way to go until the political and economic concerns of the UK’s business leaders are pacified, it is hugely encouraging to see leading CFOs’ focusing on shifting back to operational norms. In particular, a focus on how they can better serve their customers suggests that, after weathering a lengthy economic storm, CFOs have returned to thinking about long-term success. This can only be good news for the future of UK business.’’
“Our study shows that CFOs rightly recognise that technology has a big role to play in helping their businesses stay ahead – with almost three-quarters (74%) saying they have already invested in AI. To remain competitive on the world stage, it will be important for CFOs in the UK to continue to balance investment in innovation against the costs associated with mitigating risk.’’
Image Credit: Everything Possible / Shutterstock
Making a voice call is now one of the least likely things you’ll do with the device you call a phone. Instead, we use our phones to message friends, check social media, order food and flick between media channels, be that Spotify, Netflix, YouTube or The Guardian. Most importantly, our phones are full of different apps we hold subscriptions to. In reality, to be comfortable we no longer live pay cheque to pay cheque, but subscription to subscription.
Today’s consumers are less interested in owning products, and instead are more concerned with subscribing to services. They want control over a vast multitude of products and they want the flexibility to adjust them to reflect personal preference. But unlike before, they’re not interested in possessing them in the long term, they want them until they no longer find them necessary – or even until they get bored and want to move on. We’ve seen this mentality trickle down to create a new product model that has transformed several industries, including entertainment with Netflix and Spotify, catering with HelloFresh and automotive with Volvo’s Care subscription service.
According to Tien Tzuo, our ZEO and author of SUBSCRIBED, the shift away from a product-centric model to a customer-centric model is the "defining characteristic of the subscription economy".
This affinity towards subscription services isn’t an isolated trend – as a whole, the subscription economy is booming and is now valued at over €350 billion a year, the equivalent of 5% of all household spending within the EU.

Subscribe to a new type of consumer
The millennial generation is a prime example of a group that is particularly interested in owning as little as possible. Why? Because they aren’t as rooted as the generations before them. They place significant importance on the convenience of having instant access to what they need.
Think back a decade or so. Most kids would spend years saving up until they could afford a car to get them from point A to point B. Today? Teenagers spend their money elsewhere and often depend on car- and ride-sharing services such as Uber and Zipcar.
This transition away from long-term possession is also known as the end of ownership.
And while this shift may be driven largely by millennials, it’s not limited to them – the end of ownership is reaching all generations. Being able to subscribe to a service and pick up products when and where they want them, without physically owning them, has given consumers of all ages a sense of freedom that did not exist before. So much so that, according to a recent report by Zuora, 78% of consumers over the age of 55 already claim to have a subscription to a service.

Subscribe to a new business model
Spotify’s recent IPO was not only a testament to its commercial success, but a strong indication of how businesses who have embraced a subscription business model are reaping the rewards of a loyal fan-base and consistent customers.
Another example? TV streaming services. According to the same report, almost twice as many 16-24 year olds subscribe to VOD services (47%) as hold traditional TV licences (25%). One company that listened to the demand? Netflix. Once a DVD service, it completely flipped its business model to reflect the growing affinity towards streaming. And it’s working: in the first quarter alone, Netflix’s revenue shot up to an unexpectedly high £2.6bn, with over 5.4m new overseas users.
Netflix’s success and Spotify’s unique IPO illustrate how companies that embrace this new approach to business are beginning to beat out the FAMGAs of the world to become non-traditional yet established market leaders.

Subscribe to a new level of innovation
A conventional business model supports the idea that when a customer makes a purchase, they will forever own that product as it is. This means that if a brand updates its product with new innovations, the consumer has to buy the new product in its entirety to reap the benefits.
Once again, the perfect example of this is within the automotive industry. If a consumer purchases a Volvo, and then one year later the company releases an upgraded version, that car that was originally purchased is already considered to be out of date. Unless the consumer goes through the various complicated steps required to sell the original car, they’re left with an old version and likely no opportunity to enjoy the amenities of the upgraded version.
On the other hand, when a business offers services rather than products, technological innovations can be passed on to the consumer automatically – with little to no hassle. This means consumers consistently receive a top-level customer experience and, in turn, are likely to remain loyal to a particular brand in the long term.
Volvo, however, has its “Care by Volvo” service. The subscription model lets customers choose the car they want in the moment; it does not require them to buy it. They simply subscribe to the service and use a car when needed, meaning that when Volvo releases a new and improved version, they have the immediate flexibility to change cars.
As an early adopter of the subscription model in automotive, Volvo has the opportunity to nurture loyal customers who appreciate the flexibility and rich consumer experience it offers, ultimately making it harder for their heads to be turned by competitors.

Subscribe to the subscription economy
The success of Volvo, Spotify, Netflix and their forward-thinking peers shows that the subscription economy is here, and its early adopters are already disrupting industries. Businesses that don’t consider a subscription service model are almost certain to find themselves left behind, unable to transform in time to keep up with their customers’ changing behaviour.
Because in reality, if you’re not catching up to Amazon Prime for retail, Spotify for music, and/or Netflix for TV, you’re already too late.
John Phillips, Managing Director for EMEA at Zuora
Image Credit: Denys Prykhodov / Shutterstock