In part one of this discussion on data protection and technology, we looked at some of the ways that you can protect your personal data while browsing the internet and shopping online. This part looks at the shift to cloud technology and protecting data stored in the cloud or on in-house servers.
What is Cloud Technology?
Cloud technology has been around for many years now, but levels of trust in its security, and understanding of how it works, vary from person to person and from company to company.
Cloud-based software, simply put, is software that is stored on servers owned or leased by the software provider. The servers are typically held within secure, climate-controlled third-party data centres. All you need to access the software is an internet connection; the software provider takes care of the rest. You typically pay a subscription fee for the software and access it in much the same way that you would access a website.
In-House Servers
Until relatively recently, businesses that used software packages and shared files and folders across their network would have needed an in-house server and a network of workstations with unique addresses. If set up correctly, a workplace network is a simple way of sharing data among employees and does not require an internet connection to operate.
As technology improved (and with access to an internet connection or mobile data network), Virtual Private Networks (VPNs) and Remote Desktop Connections enabled companies to share a single network across multiple physical locations, both nationally and internationally.
Server hosting is a bit of a mix of the two. It is a service offered by network providers who run all the software that you would ordinarily house on your internal server on a remote server that they own or lease. You may have a server dedicated to your company, or you may share a partition of one with someone else. You typically rent or lease an amount of data storage space, much as you would rent or lease office space.
As with cloud-based software, you need your own personal computer, laptop or tablet and a reliable internet connection to access the hosted server.
Which is Better for my Business?
For many people there is something comforting about having a large server ticking away in a data room on your own premises. You know that your data is sitting in your own building, you are in control of its fate – good and bad – and you are not dependent on a third-party provider or on internet speed and stability to get your daily work done. But, and this is an important but, you then need to protect your own hardware, software and data; many companies risk losing their data through inappropriate backup schedules, insufficient hardware maintenance, power surges, viruses, spyware, hacking and a host of other factors.
Although high-end in-house servers can be extremely expensive, and the cost of maintaining them can be high, if you are in an area without fast and reliable internet this might be your only option. Even if you do have good internet, your own server can be a more cost-effective solution for a small business, and a lower-spec server or a powerful PC might meet all your needs.
Solid-state drives are faster, smaller and longer-lasting than traditional mechanical drives, which may make them an option for your in-house server, but these advantages come with a trade-off. Larger-capacity solid-state drives are expensive, especially from the better brands, which means that storing large amounts of data locally can be very costly, and increasing your storage capacity can be complicated.
Cloud-based systems (including hosted servers) easily allow multiple users to access your important data in real time, from any device, increasing productivity, access to information and user independence. This reduces business risk and ensures a level of flexibility that on-premises equipment simply can’t offer. You would typically have a known monthly cost to access the system, and extra storage or users can be added as and when needed.
Providers of cloud services are responsible for a broad set of policies, technologies, applications and controls that protect the internet portals through which you, as a client, access your data. They are responsible for ensuring the compatibility of the applications and services they provide with the browsers through which you access them. They are also responsible for the security of your information, and take care of hardware maintenance, data backups and related services for you.
Although each type of system has its pros and cons, and an initial assessment may suggest that the on-premises solution is cheaper, once all factors are considered cloud-based technology offers much greater value and flexibility.
A Common Sense Approach
Regardless of what you decide, you still need to have systems in place to prevent data breaches and potential losses. In part one of this series, we discussed how poor password security is responsible for over 80% of data breaches; leaving computers unlocked, running inadequate virus and spyware protection, and sharing your login details with other people can also lead to big problems.
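One common-sense defence is never to store passwords in plain text. The sketch below, a minimal illustration using only Python's standard library (not any particular provider's implementation), shows the widely used pattern of hashing each password with a random salt so that a stolen database does not reveal the passwords themselves:

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a salted hash so the plain-text password is never stored."""
    if salt is None:
        salt = os.urandom(16)  # fresh random salt for every password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, digest):
    """Re-derive the hash with the stored salt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)
```

Because each password gets its own salt, two users with the same password end up with different stored hashes, which defeats precomputed "rainbow table" attacks.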
Even if you have the latest and best virus and spyware protection installed, that software is always one step behind the bad guys. Antivirus tools can only defend against threats that have already been identified, so never assume that you are protected from the suspicious email you are about to open.
How does TigerFleet Store and Protect your Data?
TigerFleet’s main database is hosted on Microsoft Azure servers. Microsoft Azure has the largest global network, servicing 55 regions and 140 countries around the world. Each region is a set of data centres that are interconnected via a massive and resilient network. The network includes content distribution, load balancing, redundancy, and encryption by default.
Azure regions are organized into geographies, and each geography ensures that data residency, sovereignty, compliance, and resiliency requirements are honoured within geographical boundaries. Geographies are fault-tolerant to withstand complete region failure, through their connection to the dedicated, high-capacity networking infrastructure.
Microsoft’s data centres comply with key industry standards, such as ISO/IEC 27001:2013 and NIST SP 800-53, for security and reliability, and are managed, monitored, and administered by Microsoft operations staff. The operations staff has years of experience in delivering the world’s largest online services with 24 x 7 continuity.
TigerFleet ensures that data stored with Azure is encrypted in accordance with their standards and maintains control of the keys that are used by its cloud applications to encrypt data. Encryption of data in storage and in transit is deployed by TigerFleet as a best practice for ensuring confidentiality and integrity of data. TigerFleet uses SSL to protect communications from the internet and even between their Azure-hosted VMs.
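Protecting data in transit, as described above, means that clients refuse connections whose certificates cannot be verified. The snippet below is a minimal sketch, using only Python's standard `ssl` module, of the checks a client would enforce before exchanging data with any cloud service; it illustrates the general technique, not TigerFleet's actual client code:

```python
import ssl

# Build a client-side TLS context with certificate verification enabled,
# as a browser or API client would before talking to a cloud service.
context = ssl.create_default_context()

# These defaults reject expired, self-signed or mismatched certificates.
assert context.verify_mode == ssl.CERT_REQUIRED
assert context.check_hostname is True

# Refuse legacy protocol versions; TLS 1.2 is a common modern floor.
context.minimum_version = ssl.TLSVersion.TLSv1_2
```

Wrapping a socket with this context (via `context.wrap_socket`) would then fail loudly on any server that cannot prove its identity, rather than silently sending data over an untrusted channel.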
TigerFleet has opted for Geo-redundant storage (GRS) with Azure. GRS maintains six copies of your data: it is replicated three times within the primary region, and replicated a further three times in a secondary region hundreds of miles away, providing the highest level of durability. In the event of a failure at the primary region, Azure Storage fails over to the secondary region. GRS helps ensure that data is durable in two separate regions.
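The idea behind replicated storage can be illustrated in a few lines. This is a deliberately simplified sketch, not Azure's implementation: it writes the same blob to several "replica" locations and verifies every copy against a checksum before reporting success, much as a storage service acknowledges a write only once the replicas are durable.

```python
import hashlib
from pathlib import Path

def replicated_write(data: bytes, replicas: list[Path]) -> str:
    """Write the same blob to every replica and return its SHA-256 checksum."""
    checksum = hashlib.sha256(data).hexdigest()
    for path in replicas:
        path.write_bytes(data)
    # Verify each copy before reporting success; a corrupted or failed
    # replica would surface here instead of going unnoticed.
    for path in replicas:
        if hashlib.sha256(path.read_bytes()).hexdigest() != checksum:
            raise IOError(f"replica {path} failed verification")
    return checksum
```

In a real geo-redundant system the replicas live on independent hardware in distant regions, so that a failure at one site leaves intact, verified copies elsewhere.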
If a customer closes their account, they can request to have all of their data destroyed immediately. If this is not requested, their data is retained by TigerFleet for 12 months, which allows the client to export all of their data to Excel if they wish to use it elsewhere (e.g. upload to a new provider). At the end of this period, however, the data is destroyed.
Why Microsoft Azure?
Access to customer data by Microsoft operations and support personnel is denied by default. When access to customer data is granted, leadership approval is required and then access is carefully managed and logged. The access-control requirements are established by the following Azure Security Policy:
Azure provides customers with strong data security, both by default and as customer options. Azure is a multi-tenant service, which means that multiple customer deployments and VMs are stored on the same physical hardware. Azure uses logical isolation to segregate each customer’s data from the data of others. Segregation provides the scale and economic benefits of multi-tenant services while rigorously preventing customers from accessing one another’s data.
Microsoft helps ensure that data is protected if there is a cyberattack or physical damage to a data centre. This includes in-country/in-region storage for compliance or latency considerations, and out-of-country/out-of-region storage for security or disaster recovery purposes.
When customers delete data or leave Azure, Microsoft follows strict standards for overwriting storage resources before their reuse, as well as the physical destruction of decommissioned hardware. Microsoft executes a complete deletion of data on customer request and on contract termination.