Archive for the ‘government’ Category

Zero Trust will be the Leading Strategy for Cybersecurity and Risk Management in 2023

DoD and Forrester emphasize role of Zero Trust as cybersecurity strategy

Strengthening Vulnerable Cyber Infrastructure

Zero Trust has entered the cybersecurity fray as a leading solution to mitigate and reduce vulnerabilities. This strategy is relevant for IT infrastructure all over the world: a recent Radware report establishes that over 99.5% of global organizations deploy applications in the public cloud[1].

However, public and multi-cloud environments pose significant risks when it comes to data leaks and breaches. The same report states that “69% of organizations can trace data breaches or data exposures to inconsistent application security configurations across the different public cloud platforms.”

Both the public and the private sector have already witnessed how expensive these breaches can be, in terms of lost productivity, reputational damage, IT repair/mitigation, and ransom costs.

Sophisticated Cyberattacks

Incidents like WannaCry in 2017 showed just how strong an impact cyberattacks can have, with computers in over 150 countries affected[2] and an estimated cost of $4 billion globally. The ransomware spread across industries as well, including healthcare, education, manufacturing, financial services, and telecommunications.

Costs associated with cybercrime have only increased in the years since, with larger entities targeted. Research collected by Ivanti showed that ransomware vulnerabilities have increased by 446% since 2019[3]. In 2022 alone, major organizations like the Red Cross[4], Toyota[5], Twitter[6], and Cash App[7] have reported breaches, with records in the tens of millions affected. The Irish Data Protection Commission recently fined Meta[8] €265 million for GDPR violations, for exposing the PII of over 533 million users.

Threat of Pipedream

In April 2022, the Department of Energy, the Cybersecurity and Infrastructure Security Agency (CISA), the NSA, and the FBI issued an advisory for a malware toolkit dubbed Pipedream[9], “the most versatile tool ever made to target critical infrastructure, like power grids and oil refineries.” This toolkit was designed to target and cripple industrial control systems in critical infrastructure sectors.

Dragos, an industrial cybersecurity firm that helped analyze Pipedream, affirmed at Forrester’s 2022 Security and Risk conference[10] that cyber-attacks are increasingly being carried out by nation-states, targeting critical infrastructure sectors, including chemical, manufacturing, and energy plants.

Thankfully, Pipedream was thwarted by proactive cybersecurity measures and patches before it could be maliciously deployed. However, it is one example of how cybercrime can be used by nation-states, a trend likely to increase as cyberattack strategies improve. By carrying out remote attacks, nation-states can potentially debilitate and undermine another country’s ability to react and defend, all while denying responsibility.

It’s a new phase of warfare that isn’t all that new: countries have always used shadow entities to handle less-than-savory missions; software has simply become the most recent tool of choice.

Modern Problems Require Modern Solutions: The Dawn of Zero Trust

In their keynote address at the Forrester Security & Risk Conference, Renee Murphy and Allie Mellen cited internal reports that revealed “business continuity is the number one priority for cybersecurity teams over the next 12 months.”[11] The overlap between business continuity and cybersecurity is trust. Yet it’s not enough for businesses to have a robust cybersecurity strategy; they must also have consumer trust.

7 Levers of Trust: Accountability, Consistency, Competency, Dependability, Empathy, Integrity, Transparency

Ironically, the way we build consumer trust is by establishing a policy of not trusting anyone, otherwise known as Zero Trust. This framework is highlighted as the leading strategy to ensure business continuity by preserving consumer trust and effectively responding to evolving threats. It accounts for the evolving and fluid nature of the network edge, otherwise defined as the point of connection between a device or local network and the internet.

Connections between devices, applications, and cloud, on-prem, and hybrid networks are only increasing, which makes this network edge vulnerable. Organizations must also factor in remote work connections, hybrid cloud networks, and increased risk of cyberattacks or malware exposure. These connections and risk factors make securing the network edge ever more difficult for system admins.

How Does Zero Trust Work?

Zero Trust is a system of “least privilege,” where users only have access to the data they absolutely need. Permission must be actively enabled or allowed; the default status is to deny access. This helps prevent unauthorized access to sensitive or confidential information.

A Zero Trust framework operates on a principle of continuous identity verification and least privilege access. In effect: anyone accessing the network must be authenticated (not just once, but consistently) and they will only have access to the data they absolutely need (to contain the damage in the event of a breach).
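As a minimal illustration of these two principles (a sketch, not a prescribed implementation — the users, resources, and token check below are all hypothetical), a deny-by-default access decision might look like this:

```python
# Minimal sketch of a deny-by-default ("least privilege") access check.
# User names, resources, and the grants table are hypothetical examples.

ACCESS_GRANTS = {
    # Only (user, resource) pairs that have been explicitly granted appear here;
    # anything absent is denied by default.
    ("alice", "payroll-db"): {"read"},
    ("bob", "build-server"): {"read", "write"},
}

def is_authenticated(user: str, token: str) -> bool:
    """Stand-in for continuous identity verification (e.g., MFA, short-lived tokens).
    A real system would re-validate the session on every request."""
    return token == f"valid-token-for-{user}"  # placeholder check

def authorize(user: str, token: str, resource: str, action: str) -> bool:
    """Default is deny: access requires both fresh authentication and an explicit grant."""
    if not is_authenticated(user, token):
        return False
    return action in ACCESS_GRANTS.get((user, resource), set())

print(authorize("alice", "valid-token-for-alice", "payroll-db", "read"))   # explicitly granted
print(authorize("alice", "valid-token-for-alice", "payroll-db", "write"))  # no grant, so denied
print(authorize("bob", "stale-token", "build-server", "read"))             # fails re-verification
```

Note that denial is the structural default here: an empty grants table denies everything, which is the inverse of perimeter-based models that allow traffic unless a rule blocks it.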

One of the major benefits of Zero Trust is that it provides protection against possible data leaks and breaches, including those stemming from insider threats. Joseph Blankenship, Research Director at Forrester, stated that “26% of data breaches are caused by insider incidents, most of which are malicious”[12].

Forrester Analysis of Zero Trust

Over the next three years, Forrester analysts anticipate that the weakest points of IT security will remain individuals, with a need for identity-focused protection (“identity as a perimeter”)[13].

As part of the Forrester panel on insider risk, Dr. Caputo emphasized that adversaries are looking for targets inside organizations struggling with psychological-financial strain: “it’s not how much debt someone has, but how that debt makes them feel.”

This is where the full concept of Zero Trust shines, not just as a technology solution but as a cultural mindset. By using a model of least privilege and repeated verification, granting data access can become a more granular process. Stronger, built-in controls and protections help make processes around using data and collaborating with teams more secure, without compromising productivity.

Department of Defense Embraces Zero Trust

The U.S. government has been signaling its investment in an updated cybersecurity strategy across various departments for several years:

  • 2018 – CISA formed as a branch in the Department of Homeland Security to focus on the government’s official cybersecurity posture.
  • 2020 – Cybersecurity Maturity Model Certification (CMMC) program launched by the DoD.
  • 2021 – an Executive Order was issued, mandating investment and restructuring of federal information security systems.

The Executive Order explicitly included references to the Zero Trust framework as part of the updated cybersecurity solution. CISA advisories have also urged government and private sector organizations to begin developing Zero Trust security strategies.

Most recently, the Department of Defense released their Zero Trust Strategy and Roadmap for implementation by FY 2027. This roadmap includes base level and advanced Zero Trust targets across seven pillars: user, device, application & workload, data, network & environment, automation & orchestration, and visibility & analytics.

7 Pillars of Zero Trust by US DoD

Other government departments are expected to follow suit to create comprehensive security across the entire network surface, along with governments around the world and the private sector.

This adoption cascade will create a more resilient, responsive cybersecurity network across industries, sealing dangerous loopholes and preventing data leaks that could otherwise lead to catastrophic data breaches. Zero Trust is the framework that provides both a technological and cultural goalpost for the coming years.


Article written by Katie Gerhardt, Jr. Product Marketing Manager



[1] Radware. “Application Security In A Multi-Cloud World.” Retrieved 29 Nov 2022 from

[2] Kaspersky. “What is WannaCry Ransomware?” Retrieved 29 Nov 2022 from

[3] Louis Columbus. VentureBeat. 20 Oct 2022. “Ransomware vulnerabilities soar as attackers look for easy targets.” Retrieved 30 Nov 2022 from

[4] International Committee of the Red Cross, 24 June 2022. Retrieved 29 Nov 2022 from

[5] James Coker. InfoSecurity Group. “Toyota Reveals Data Leak of 300,000 Customers.” Retrieved 29 Nov 2022 from

[6] Twitter. 5 Aug 2022. “An incident impacting some accounts and private information on Twitter.” Retrieved 29 Nov 2022 from

[7] Trend Micro. 7 Apr 2022. “Cash App Suffers Data Breach Affecting 8.2M Customers.” Retrieved 29 Nov 2022 from

[8] Sumeet Wadhwani. Spiceworks. 29 Nov 2022. “Meta Fined $275M for Failing to Protect the Data of 533M Facebook Users.” Retrieved 29 Nov 2022 from

[9] Andy Greenberg. WIRED. 13 Apr 2022. “Feds Uncover a ‘Swiss Army Knife’ for Hacking Industrial Control Systems.” Retrieved 29 Nov 2022 from

[10] Robert Lee. Forrester Security & Risk Conference. 8 Nov 2022. Keynote Address: “ICS Threats: From Pipe Dream to PIPEDREAM.”

[11] Renee Murphy and Allie Mellen. Forrester Security & Risk Conference. 8 Nov 2022. Keynote Address: “Securing the Future: Geopolitical Risk will Redefine Security Strategies for the Next Decade.”

[12] Joseph Blankenship (Forrester), Alla Valente (Forrester), Dr. Deanna D. Caputo (MITRE), Ryan Boyer (CISA). Forrester Security & Risk Conference. 9 Nov 2022. Keynote Panel Discussion: “Insider Risk Reduction Requires Two Parts Culture, One Part Security.”

[13] Laura Koetzle. Forrester Security & Risk Conference. 9 Nov 2022. Keynote Panel Discussion: “Take a Zero Trust Approach to Threat Prevention, Detection, and Response.”

Invalid Privacy Shield – What is at Stake?

On July 16, the Court of Justice of the European Union (CJEU) invalidated the EU–U.S. Privacy Shield, calling into question the data transfer methods used between the European Union and the United States. In 2000, the European Commission put in place an adequacy mechanism known as the “Safe Harbour” for personal data transfers to the U.S. It was invalidated by the CJEU in 2015, due in large part to U.S. surveillance practices that arose in the wake of 9/11, and was replaced in 2016 by the Privacy Shield, which aimed to address the concerns the CJEU outlined in its Schrems I judgment. The Court also looked at Standard Contractual Clauses (SCCs). While the CJEU did not invalidate this mechanism, it did underline that it is up to the exporting and importing organizations to verify that the legal system of the country where the recipient organization resides provides sufficient safeguards.

Under Privacy Shield, U.S. companies guaranteed that they would meet seven principles when handling EU-governed personal data:

  • Notice: Individuals must be notified about the collection and use of their personal information.
  • Choice: Organizations must give individuals the opportunity to opt out of the disclosure of their personal data to third parties.
  • Accountability for Onward Transfers: Organizations are accountable for applying the notice and choice principles in order to disclose personal data to third parties.
  • Access: Individuals must be able to access their personal data being stored by an organization.
  • Security: Organizations must protect personal data from loss, misuse, unauthorized access, and disclosure.
  • Data Integrity: Organizations must ensure data is reliable and relevant for the purpose it is being used.
  • Recourse, Enforcement, and Liability: Individuals have the right to affordable recourse mechanisms if they believe their personal data has been misused.

The Privacy Shield framework governing the transfer of personal data from Europe to the United States is no longer valid. This ruling closely mirrors one from five years ago that undid the predecessor to the Privacy Shield framework, the Safe Harbor. The reasoning in both decisions (nicknamed Schrems I and Schrems II after the plaintiff) is that, because surveillance is such a consistent part of American life, and because the government has such easy access to data from large companies and their affiliates, the likelihood that European personal data would be protected and/or only utilized in ways that were understood was fairly low.

Although Privacy Shield was invalidated, SCCs are still permitted for the transfer of EU personal data outside of the EU. However, these clauses are merely a data transfer tool, so organizations must ensure, prior to any data transfers, that there is an adequate level of protection against U.S. government surveillance. The CJEU also emphasized three stakeholder obligations:

  1. Data exporters are responsible for verifying the importer’s ability to provide an equivalent level of data protection in the third country.
  2. Data importers must notify exporters if they are unable to comply with the SCCs.
  3. Data exporters must suspend or terminate the transfer if the importer gives notice that they cannot comply with the SCCs.

To determine which new data transfer mechanism should replace Privacy Shield, you need to understand how your company collects, stores, uses, and transfers data. Implementing a robust data governance strategy can help your organization build processes and policies for managing data, evaluating third parties, and even monitoring regulatory change. With the help of the NIST Privacy Framework, your organization can improve its approach to using and protecting personal data and determine which data transfer mechanism aligns best with your organization’s business needs.

If your company was previously certified under Privacy Shield, the first and most important step is to identify every contract and every relationship that involves the transfer of personal data from the European Union to the United States. If all the data you’re transferring is transactional, for instance, and doesn’t touch on anything that could be considered personal data, Schrems II isn’t really going to have much of an effect on you. But most businesses do collect some form of personal data, even if it’s only the business address and contact information of their partners. In that case, once you’ve identified which relationships are a source of data from European citizens, you have to review your agreements and determine whether you want to continue to receive that data and, if so, how the SCCs could be implemented. Because many data relationships are bilateral in both benefit and cost, partners may be willing to work with you to rapidly ensure that no interruption in business occurs. But there will always be instances where a partner seizes the opportunity to renegotiate the deal.

With FileCloud Online, you have the option to host in secure, world-class data centers. Data storage is available in the region of your choice (US, EU, Canada, Australia, APAC, and SE Asia) to meet data residency requirements. Also, with FileCloud On-Premise, you can use your existing infrastructure to store and share data while abiding by EU data protection laws.

Is AWS GovCloud an ITAR Compliant Cloud Services Platform?


The International Traffic in Arms Regulations (ITAR) are rules which pertain to individuals and companies that deal with defense technology, services, or technical data. This includes documents, schematics, photos, and other materials on the United States Munitions List (USML). The guidelines were created to prevent confidential material (with possible defense and space applications) from falling into the hands of non-U.S. citizens. You may be thinking: Amazon Web Services (AWS) is a cloud service platform, so why does it need to be ITAR compliant? Well, AWS provides cloud services, but it has no direct control over what its users store on its platform. Since there may be people in the defense industry who are interested in using AWS to store and transmit data, the company created AWS GovCloud.

What is AWS GovCloud?

AWS GovCloud is ITAR compliant. It was developed for individuals and companies who deal with data subject to ITAR rules. In keeping with ITAR regulations, AWS GovCloud is an isolated cloud platform with its servers located within U.S. territory, and AWS only allows workers who are U.S. citizens to access the platform. Additionally, AWS works with a third-party organization to assess and validate that AWS GovCloud is compliant with ITAR guidelines. AWS GovCloud has no ITAR certificate, but it has been granted a provisional authority to operate a platform for ITAR data by the Joint Authorization Board (JAB).

AWS GovCloud accounts are only available to U.S. nationals. There is a vetting process for the primary account holder before access to the platform is granted. The goal of the platform is to make it easier for companies that deal with ITAR data to take advantage of modern technology without violating the law.

Here are a few advantages of using AWS GovCloud:

A. Makes it easier to comply with ITAR regulations.

It can be challenging to adhere to ITAR regulations as far as data management is concerned. While it can be easy to physically restrict access to certain documents, how can you do so in the virtual world? This is where AWS GovCloud comes in. Using this platform, you can effectively cordon off access to USML data and avoid flouting ITAR regulations. Also, since AWS GovCloud is separate from other Amazon cloud services, you can keep your USML-related documents isolated to avoid a mix-up. AWS GovCloud is not only ITAR compliant but also adheres to FedRAMP regulations.

B. Total security.

All data on AWS GovCloud servers is encrypted. The platform supports FIPS 140-2 validated encryption, which helps protect your data during storage and transmission. Given the recent spate of hacking incidents, data encryption has become more critical, and AWS GovCloud encryption is top-of-the-line.

C. Control access to sensitive data.

AWS GovCloud allows you to control access to sensitive data. You can limit access to specific individuals or to particular times of the day and locations. AWS GovCloud also gives you an overview of the individuals that have access to your data on the platform. This complies with the requirement that all ITAR-related data be monitored and audited.
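To make the idea of limiting access by individual, time of day, and location concrete, here is a small hedged sketch of such a conditional check. It is not AWS GovCloud’s actual mechanism (which is configured through platform policies rather than application code); the user names, hours, and network range are illustrative assumptions:

```python
# Hypothetical sketch of a conditional access policy: access is limited to
# specific individuals, certain hours of the day, and an approved network.
# The users, hours, and CIDR range below are illustrative, not AWS defaults.
from datetime import time
from ipaddress import ip_address, ip_network

POLICY = {
    "allowed_users": {"j.smith", "a.jones"},          # example cleared personnel
    "allowed_hours": (time(8, 0), time(18, 0)),       # business hours only
    "allowed_networks": [ip_network("10.0.0.0/8")],   # internal network only
}

def may_access(user: str, now: time, source_ip: str, policy=POLICY) -> bool:
    """Grant access only when user, time of day, and source network all match."""
    start, end = policy["allowed_hours"]
    return (
        user in policy["allowed_users"]
        and start <= now <= end
        and any(ip_address(source_ip) in net for net in policy["allowed_networks"])
    )

print(may_access("j.smith", time(9, 30), "10.1.2.3"))   # inside all three constraints
print(may_access("j.smith", time(22, 0), "10.1.2.3"))   # outside allowed hours
```

Because every condition must pass, a request from the right person at the wrong time, or from an unapproved network, is denied, which mirrors the audited, restrictive posture ITAR data requires.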

All these features make it easy to maintain ITAR compliant status. It is important to note that the U.S. government takes compliance with ITAR very seriously. In the past, companies that violated these guidelines have been fined millions of dollars. The penalty for not adhering to ITAR is up to $500,000 in civil cases and $1,000,000 in criminal cases per instance of violation, as well as imprisonment for up to 10 years.

Why is AWS GovCloud Important?

It can be difficult for companies to restrict access to data on a public cloud platform. There is a chance that your non-U.S. workers may inadvertently open certain documents that they’re not supposed to. This is why AWS GovCloud exists: it makes the process of complying with ITAR easier.

Are There Exceptions to ITAR?

Technically, everything in the USML is subject to ITAR and is not to be exposed to non-U.S. nationals. However, ITAR can be extremely difficult to enforce in some situations. We live in a globalized world with multinational companies, and the internet and increased migration have made it easier than ever for companies to hire foreign experts. As a result, the U.S. State Department can grant exemptions to some individuals. Also, countries like the UK and Canada have standing agreements with the U.S. that cover ITAR, so people from these countries can be permitted to access data in the USML.

How to Avoid Violating ITAR with AWS GovCloud

As indicated above, it can be tricky to enforce ITAR in an organization. However, these steps can help you avoid running afoul of the law.

The very first step is to identify the documents in your database that are covered by the ITAR. You can then restrict access to them. It is also advisable to indicate on the documents that they are covered by ITAR to ensure your workers do not mistakenly share them with unauthorized persons. Most importantly, you must educate your workers on the importance of ITAR and lay down policies on how documents that fall under the USML must be treated to avoid breaking the law.

If you plan to export ITAR data and materials, you need a license from the State Department. In some situations, transferring data to a server in another country may also be categorized as export. Therefore, you need to consult a lawyer and other experts on the subject. Remember, due diligence is crucial: you will be held accountable for sharing USML data with any non-U.S. person or company, even if they are based in the United States.

Ultimately, AWS GovCloud is not responsible if USML data on its platform is shared with unauthorized persons or if you wrongly provide access to non-U.S. citizens. AWS is only accountable for the integrity of its servers. It is up to you to take the necessary precautions in terms of accessing and sharing data when using AWS GovCloud.

Author: Rahul Sharma

Why US Government Organizations Should Move to Private Cloud



Since its inception, cloud computing has managed to transform the business landscape in unforeseen ways. While the private sector has been capitalizing on the multiple benefits of cloud computing for a while now, government organizations have also aggressively started to embrace the cloud. As it stands, the IT environment of most government organizations is typified by poor asset utilization, duplicative processes, fragmented demand for resources, poorly managed environments, and prolonged delays in getting things done. The end result is an inefficient system that has a negative impact on the organization’s ability to serve the American public. The innovation, agility, and cost benefits of a private cloud computing model can significantly enhance government service delivery. A move to the cloud for government organizations directly translates to public value by improving operational efficiency and the response time to constituent needs.

The Cloud First Initiative

In February 2011, the first Federal CIO, Vivek Kundra, announced the Cloud First policy. It was presented as a crucial aspect of government reform efforts to achieve operational efficiencies by cutting waste and helping government agencies deliver constituent services in a more streamlined and faster way. Up to 2014, the adoption rate was slow: a 2014 report by the U.S. Government Accountability Office showed that only 2% of IT spending went towards cloud computing that year. However, the tide has shifted in recent years. Agencies across the federal government have espoused cloud computing solutions and architectures to facilitate services to constituents and reduce reliance on large-scale, traditional IT infrastructure investments.

Currently, AWS reports that GovCloud has grown 221% year-over-year since it launched in 2011. Microsoft also claims that Microsoft Cloud for Government, which includes Office 365 Government, Dynamics CRM, and Azure Government, has attracted over 5.2 million users. Despite its palpable success, Cloud First has had its share of critics, who have blamed the perceived slow adoption on the lack of federal technical experience in cloud deployments. Below are some of the compelling reasons why government agencies should adopt a private cloud computing model.

I. Reduced Infrastructure Costs

By consolidating server footprints via virtualization and cloud efforts, government agencies significantly reduce the cost of IT ownership. Agencies that operate in-house IT gear have to deal with data center security on top of hardware, software, and network maintenance. These are all resource-intensive workloads that cloud vendors handle on behalf of their clients. The minute an agency offloads all of it, it frees itself up to focus on the particular capabilities and features it has to offer. Private cloud computing solutions are typically bundled with asset management, threat and fraud prevention and detection, and monitoring programs. Adopting a private cloud model enables government agencies to become agile and responsive to changing business conditions.

II. Big Data Consensus

IDC reports that approximately 2.5 exabytes of data is produced on a daily basis. Government agencies have a ton of data, and having a human look at all of it is virtually impossible. The old model of data distribution greatly diminishes that data’s value to end-users, and ultimately to the taxpayer. A private cloud computing model is the answer to big data analysis. Tools that utilize artificial intelligence, machine learning, and natural language processing can quickly and accurately examine terabytes of data for anomalies and patterns, helping federal officials make informed decisions. Additionally, once data has been made available via the cloud, it is readily accessible, meaning resource requests that previously took months to process can be handled in a short time.

III. Data Sovereignty and Regional Concerns

When it comes to the cloud, ownership of data assets leads to more questions. Erosion of information asset ownership is undoubtedly a potential concern when resources are moved to any external system, public cloud included. There is an inherent difference between being responsible for data as a custodian and having complete ownership of it. Although legal data ownership stays with the originating data owner, a potential area of concern with a public cloud deployment is that the cloud vendor may acquire both roles. The EU has been at the forefront of clearing up the confusion and on the 25th of May 2018 will introduce a regulation that establishes new rules to help its citizens retain full control over personal data.

Another area of concern involves the complex legal, technical, and governance issues that surround hosting government data in varying jurisdictions. Governments are known to like concrete borders, but the cloud is global; it transcends physical spaces and borders. Since the services exist globally and users can interact and share data remotely, which states or municipalities are responsible for the data? Whose laws apply, or don’t apply, to any given exchange?

US government agencies have to adopt cloud strategies aimed at retaining sovereignty over government data. For any government agency seeking flexible and scalable data center solutions, a private cloud deployment can tie together a range of integrated, end-to-end solutions that leverage cloud capabilities. With a private cloud, the complexity of legal and government regulations is taken out of the equation. The data is maintained by government agency employees and is made available via internally managed technology platforms or solutions like FileCloud. The ownership or jurisdiction of the data is no longer in question.

IV. You Deployed it, Now Secure It

Security is typically the top concern for federal IT managers when it comes to the migration of applications and data into the cloud. Governments understand that information is power and data is a crucial asset. Federal agencies represent a huge chunk of the globe’s largest data repositories, including tax, employment, weather, agriculture, and surveillance data, among others. A recent study by MeriTalk revealed that only one in five of the federal IT professionals surveyed believe that the security offered by cloud providers is sufficient for federal data. However, the same study also concluded that 64 percent of federal IT managers are more likely to place their cloud-based applications in a private cloud.

Why private cloud? Control. A private cloud deployment meets strict security needs with more resource control and data isolation. Government organizations have to send and receive sensitive information while ensuring it’s only accessible to authorized users. Additionally, they have to maintain control of each user’s read and write rights to said data. Public cloud solutions simply don’t fit the bill for most government agencies because the deployed applications and data have to remain completely under agency control. Private cloud solutions enable government agencies to leverage their existing security infrastructure while staying in control of their data. Since the deployment functions within your existing framework, the need to reinvent government processes or security policies is eliminated.

FedRAMP (the Federal Risk and Authorization Management Program) standardizes security services and streamlines assessments so that any cloud vendor being considered by federal agencies is only evaluated once at a federal level. Safeguarding the security and integrity of data falls upon individual government organizations. A private cloud model gives organizations better performance and security control over the physical infrastructure that underlies its virtual servers.

V. Cross Agency Collaboration

Government agencies require a digital terrain through which to comfortably and confidently collaborate, irrespective of agency or department. For example, different agencies may need to share compliance data, regulatory documents, case information, or disaster response plans. For optimal collaboration efficacy, these resources have to be accessible to the workers within their respective organizations, to outside contractors, and to the general public when needed. Government agencies can leverage the security infrastructures and on-premises directories of a private cloud, ensuring that sensitive data remains within the control of the organization and only authorized persons have access to it. A private gov-cloud allows government organizations to collaborate both internally and across extended ecosystems in a compliant, secure, and auditable manner.

VI. Citizen Service Delivery

Most local, state, and federal government agencies offer a variety of citizen services. Cloud computing helps in the delivery of those services and subsequently improves the lives of citizens at all those levels. For example, enabling constituents to monitor water and energy consumption encourages them to be more vigilant about their usage. Quick and transparent access to service requests such as loans and applications improves awareness and inclusion. A private cloud computing model is an ideal way of empowering and informing citizens.

In Closing

Cloud computing represents an amazing opportunity to drastically revolutionize how government organizations manage, process, and share information. Addressing all the challenges associated with cloud adoption can seem daunting, especially if a government organization lacks expertise in cloud migration and deployment. Nevertheless, it’s clear that government agencies wish to maintain high standards of privacy, security, and cost management in their pursuit of transforming operations into a flexible, dynamic environment. The ideal solution for them is a private cloud.

Author: Gabriel Lando


FileCloud Empowers Compliant Enterprise File Collaboration for Law Enforcement Agencies

FileCloud, a cloud-agnostic Enterprise File Sharing and Sync (EFSS) platform, announced that it has partnered with CJIS Solutions, a leading CJIS-compliant cloud and solutions provider for law enforcement, to deliver FileCloud’s EFSS platform as Law Share for law enforcement agencies and vendors.

“The majority of police departments across the nation are still using legacy systems for file sharing,” said Venkat Ramasamy, COO, FileCloud. “We are excited to partner with CJIS Solutions to host FileCloud’s flexible and adaptable solution to solve the complex enterprise file sharing and sync environment of today’s law enforcement—a highly regulated environment that requires customizable data controls and tools.”


In the partnership, CJIS Solutions will handle compliance requirements and service agreements with customers, and FileCloud will deliver product features and updates. Benefits include:


  • Control: Gives complete control of where data is stored and processed. For CJIS compliance, entities must control data physically and logically.

  • Archiving: Enables entities to design their own archiving policies (e.g., video files can remain in-house on a storage system for 30 days and then move to cloud storage).

  • Performance: Viewing, saving and case management can be conducted in-house without taking up bandwidth or racking up transfer fees while leveraging the scalable cloud storage fabric for longer-term storage.  

  • Infrastructure Flexibility: Gives complete flexibility on where customers can store their data. Customers can enable hybrid cloud solutions to blend on-premise and off-site cloud storage.


  • Domain Knowledge: Owned and operated by law enforcement, CJIS Solutions applies its law enforcement specific domain knowledge and experience to solve police agencies’ unique use cases and requirements.
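As an illustration, the 30-day archiving policy described above maps naturally onto an S3-style lifecycle rule. The rule ID, key prefix, and storage class below are hypothetical; the dict mirrors the shape AWS's lifecycle-configuration API accepts:

```python
# Sketch of a 30-day archiving policy as an S3-style lifecycle rule.
# The rule ID, prefix, and storage class are illustrative assumptions.
lifecycle_policy = {
    "Rules": [
        {
            "ID": "archive-case-video-after-30-days",  # hypothetical rule name
            "Filter": {"Prefix": "evidence/video/"},   # hypothetical key prefix
            "Status": "Enabled",
            "Transitions": [
                {
                    "Days": 30,                 # keep in primary storage for 30 days
                    "StorageClass": "GLACIER",  # then move to long-term cloud storage
                }
            ],
        }
    ]
}

# In a real deployment this dict would be passed to a call such as boto3's
# s3.put_bucket_lifecycle_configuration(Bucket=..., LifecycleConfiguration=lifecycle_policy).
```

The point of expressing the policy as data is that agencies can version it, audit it, and apply it per bucket without touching application code.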


“Compliant enterprise governance is imperative for law enforcement officers,” said Chief Michael J. Coppola, President and Founder, CJIS Solutions. “With secure, collaborative practices, agencies can share information in real time, straight from the field, to expedite digital evidence handling. Law Share also gives agencies the ability to answer discovery and public record requests compliantly and efficiently, saving taxpayers money and streamlining the traditionally demanding process. Partnering with FileCloud to deliver their offering as Law Share gives CJIS Solutions the platform needed to achieve a compliant product for agencies.”


This partnership offers a best-in-class solution for any law enforcement agency: the versatility of the FileCloud platform combined with the domain expertise of CJIS Solutions. For more information, please visit the FileCloud and CJIS Solutions websites.


About CJIS Solutions

CJIS Solutions is owned and operated by law enforcement executives. The company has experience not only in the technology itself, but also in the use, administration, policymaking, and management components of an agency. CJIS Solutions offers hosted email, mobile device management, data backup, hosted server environments, remote desktop solutions, and much more, removing the dependency on expensive in-house hardware. For more information, please visit the CJIS Solutions website.


About FileCloud

FileCloud, based in Austin, Texas, develops a unified, secure enterprise file services platform that organizes enterprise data and enhances collaboration and productivity while providing ironclad data protection. FileCloud's solutions offer powerful file sharing, sync, and mobile access capabilities on public, private, and hybrid clouds. The company offers two products, Tonido for consumers and FileCloud for businesses, which are used by millions of customers around the world, ranging from individuals to Global 2000 enterprises, educational institutions, government organizations, and managed service providers. For more information, visit the FileCloud website.

When Does AWS GovCloud Make Sense?


With GovCloud, AWS provides a comprehensive and reliable way to implement and manage government technology infrastructure. By providing services based on its own back-end technology infrastructure, which it has spent over a decade perfecting, AWS delivers one of the most reliable, cost-efficient, and scalable web infrastructures available. GovCloud was launched in 2011 to satisfy stringent regulatory requirements for local, state, and federal governments. AWS's efforts to meet regulatory standards and increase feature consistency between its public sector and commercial solutions have led to the addition of dozens of new services and nine new regions across the planet. This enables IT departments within agencies to reap the same cloud computing benefits enjoyed by all other AWS users, such as improved scalability and agility and better alignment of costs.

Amazon explains that GovCloud addresses specific regulatory and compliance requirements, such as the International Traffic in Arms Regulations (ITAR), which govern how defense-related data is stored and managed. To guarantee that only designated individuals within the United States have access, GovCloud segregates data both physically and logically. AWS GovCloud is not limited to government agencies; the region is also available to vetted organizations and contractors who operate in regulated industries, such as government contractors that handle sensitive information.

When Does AWS GovCloud Make Sense?

I. High Availability Is Important to Mission Critical Applications

Building a highly available, reliable infrastructure in an on-premises data center is a costly endeavor. AWS offers services and infrastructure for building fault-tolerant, highly available systems. By migrating applications and services to AWS GovCloud, agencies not only benefit from the many features of cloud computing but also instantly reap improvements in the availability of their applications and services. With the right architecture, agencies get a production environment with a higher availability level, without any additional processes or complexity.

Some of the services GovCloud users can access to get this out-of-the-box redundancy, durability, and availability include: EC2 coupled with Auto Scaling, for scalable compute capacity; VPC, to provision isolated private sections of AWS; Elastic Load Balancing (ELB), to automatically distribute incoming application traffic across multiple EC2 instances; Direct Connect, to establish a private connection between an AWS GovCloud region and your data center; and Elastic Beanstalk, to deploy and scale web apps and services.
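ELB's core job, spreading incoming requests across a pool of instances, can be pictured as simple round-robin distribution. The instance IDs below are made up, and a real load balancer also weighs instance health; this is only a conceptual sketch:

```python
from itertools import cycle

# Hypothetical pool of EC2 instances sitting behind a load balancer.
instances = ["i-0aaa111", "i-0bbb222", "i-0ccc333"]

# Round-robin rotation: a simplified stand-in for what ELB does.
rotation = cycle(instances)

def route_request():
    """Return the instance that should handle the next request."""
    return next(rotation)

# Three consecutive requests land on three different instances.
handled_by = [route_request() for _ in range(3)]
```

Because no single instance receives all the traffic, one instance failing or being replaced by Auto Scaling does not interrupt the service as a whole.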

II. Big Data Requires High-Performance Computing

User productivity and experience are key considerations, and both hinge on the performance of applications in the cloud. Government agencies typically amass huge sets of data that carry crucial insights. AWS GovCloud allows you to spin up large clusters of compute resources on demand, while only paying for what you use and obtaining the business intelligence required to fulfill your missions and serve your citizens. Additionally, GovCloud makes available low-cost, flexible IT resources, so you can quickly scale any big data application, including serverless computing, Internet of Things (IoT) processing, fraud detection, and data warehousing. You can also easily provision the right size and type of resources you require to power your big data analytics applications.

III. High Data Volume Means Higher Storage and Backup Needs

A major consideration when migrating to the cloud is secure, scalable storage. For government organizations, this need is amplified, not only because of the volume of data that needs to be stored, but also because of the sensitive nature of said data. AWS provides scalable capacity and direct access to durable and cost-effective cloud storage managed by U.S. persons, while satisfying all security requirements. GovCloud users have access to multiple storage options, ranging from high-performance object storage to file systems attached to an EC2 instance. AWS also offers a native scale-out shared file storage service, Amazon EFS, that gives users a file system interface and file system semantics. Amazon Glacier and S3 provide low-cost storage options for the long-term storage of huge data sets.

Customers can have information stored in Redshift, Glacier, S3, and RDS automatically encrypted with AES-256, a symmetric-key encryption standard that uses 256-bit keys. Additionally, using very simple approaches, IT systems can be backed up and restored at a moment's notice.
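For S3, server-side AES-256 encryption can be requested per object with a single parameter. The bucket and key names below are hypothetical; the dict shows the parameter shape a boto3 `put_object` call would take:

```python
# Request parameters for uploading an object with AES-256 server-side
# encryption (SSE-S3). Bucket name and object key are illustrative.
put_object_params = {
    "Bucket": "agency-records",           # hypothetical bucket
    "Key": "reports/2017/q3.pdf",         # hypothetical object key
    "Body": b"...report contents...",
    "ServerSideEncryption": "AES256",     # ask S3 to encrypt at rest with AES-256
}

# With boto3 this dict would be unpacked into s3.put_object(**put_object_params);
# S3 then manages the 256-bit keys and decrypts transparently on authorized reads.
```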

IV. Critical Applications Should Scale With User Demand

Predictable workloads can be served economically with reserved instances, while spiky workloads need on-demand resources. AWS utilizes advanced networking technology built for scalability, high availability, security, and reduced cost. Using features such as Elastic Load Balancing and Auto Scaling, GovCloud users can easily scale on demand. Auto Scaling enables government agencies to maintain application availability by dynamically scaling EC2 capacity up or down according to specified conditions. Amazon Elastic Compute Cloud (EC2) provides resizable, secure compute capacity in the cloud. It is built to make web-scale computing simpler, enabling users to quickly and efficiently scale capacity as computing requirements change.
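The effect of auto-scaling can be sketched with a toy target-tracking rule: size the fleet so that average CPU utilization stays near a target. This is an illustrative model, not Amazon's actual scaling algorithm, and the thresholds are assumptions:

```python
import math

def desired_capacity(current_instances, avg_cpu, target_cpu=50.0,
                     min_size=1, max_size=20):
    """Toy target-tracking rule: pick a fleet size that would bring
    average CPU back to roughly target_cpu, within min/max bounds."""
    wanted = math.ceil(current_instances * avg_cpu / target_cpu)
    return max(min_size, min(max_size, wanted))

# A spike to 80% CPU across 4 instances suggests growing the fleet to 7;
# a lull at 10% suggests shrinking it back down to the minimum of 1.
peak = desired_capacity(4, 80)
lull = desired_capacity(4, 10)
```

The min/max bounds mirror an Auto Scaling group's configured limits: agencies scale within a budget ceiling while never dropping below the capacity their availability target requires.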

In Closing

As the number of government organizations moving to the cloud continues to rise, these organizations will require a platform for compliance and risk management – a place where confidential, sensitive or even classified data and assets remain secure. GovCloud provides a quick way for government agencies to host and update cloud data and applications so that contractors and employees can focus on service delivery rather than managing server infrastructure.

Government organizations can take full advantage of GovCloud and all that it has to offer via content collaboration software. FileCloud on AWS GovCloud is an ideal solution for government agencies that want complete control and security of their files.
Click here to learn more about FileCloud on AWS GovCloud.


Author: Gabriel Lando

FileCloud Empowers Government Agencies with Customizable EFSS on AWS GovCloud (U.S.) Region

FileCloud, a cloud-agnostic Enterprise File Sharing and Sync platform, today announced availability on AWS GovCloud (U.S.) Region. FileCloud is one of the first full-featured enterprise file sharing and sync solutions available on AWS GovCloud (U.S.), offering advanced file sharing, synchronization across OSs and endpoint backup. With this new offering, customers will experience the control, flexibility and privacy of FileCloud, as well as the scalability, security and reliability of Amazon Web Services (AWS). This solution allows federal, state and city agencies to run their own customized file sharing, sync and backup solutions on AWS GovCloud (U.S.).

“Having FileCloud available on AWS GovCloud (U.S.) provides the control, flexibility, data separation and customization of FileCloud at the same time as the scalability and resiliency of AWS,” said Madhan Kanagavel, CEO of FileCloud. “With these solutions, government agencies can create their own enterprise file service platform that offers total control.”

Government agency and defense contractors are required to adhere to strict government regulations, including the International Traffic in Arms Regulations (ITAR) and the Federal Risk and Authorization Management Program (FedRAMP). AWS GovCloud (U.S.) is designed specifically for government agencies to meet these requirements.

By using FileCloud and AWS GovCloud (U.S.), agencies can create their own branded file sharing, sync and backup solution, customized with their logo and running under their URL. FileCloud on AWS GovCloud offers the required compliance and reliability and delivers options that allow customers to pick tailored cloud solutions. FileCloud is a cloud-agnostic solution that works on-premises or on the cloud.

“FileCloud allows us to set up a secure file service, on servers that meet our clients’ security requirements,” said Ryan Stevenson, Designer at defense contractor McCormmick Stevenson. “The easy-to-use interfaces and extensive support resources allowed us to customize who can access what files, inside or outside our organization.”

Try FileCloud for free!

GDPR – Top 10 Things That Organizations Must Do to Prepare

May 25, 2018 – that's probably the biggest day of the decade for the universe of data on the Internet. On this date, Europe's data protection rules – the European General Data Protection Regulation (GDPR) – become enforceable. The initial conversations around GDPR began in 2012, followed by lengthy negotiations that ultimately culminated in the GDPR proposal. At the time of writing this guide (September 2017), most European businesses have either started making their first moves toward GDPR compliance or are set to do so. Considering that GDPR is a stringent regulation with provisions for significant penalties and fines, it's obvious how important a topic it has become for tech-powered businesses.

Since every business uses technology to survive and thrive, GDPR is relevant to most businesses. For any business owner, entrepreneur, enterprise IT leader, or IT consultant, GDPR is as urgent as it is critical. However, it resembles the Y2K problem in that everybody is talking about it without really knowing much about it.

Most companies are finding it hard to understand the implications of GDPR, and what they need to do to be compliant. Now, all businesses handle customer data, and that makes them subject to Data Protection Act (DPA) regulations. If your business already complies with DPA, the good news is that you already have the most important bases covered. Of course, you will need to understand GDPR and make sure you cover the missing bases and stay safe, secure, reliable, and compliant in the data game. Here are 10 things businesses need to do to be ready for GDPR.

Top 10 things that organizations should do to prepare and comply with GDPR

1.      Learn, gain awareness

It is important to ensure that key people and decision makers in your organization are well aware that the prevailing law is changing to GDPR. A thorough impact analysis needs to be done, and any areas that could cause compliance issues under GDPR need to be identified. An appropriate starting point is your organization's risk register, if one exists. GDPR implementation can have significant resource implications, particularly at large and complex organizations. Compliance could be difficult if preparations are left until the last minute.

2.      Analyze information in hand

It is necessary to document what personal data is held, where it came from, and who it is shared with. You may need to organize an organization-wide information audit; in some cases, an audit of specific business areas will suffice.

As per GDPR, there is a requirement to maintain records of all your activities related to data processing. The GDPR comes ready for a networked scenario. For instance, if you have shared incorrect personal data with another organization, you are required to inform the other organization about this so that it may fix its own records. This automatically requires you to know the personal data held by you, the source of the data and who it is being shared with. GDPR’s accountability principle requires organizations to be able to demonstrate their compliance with the principles of data protection imposed by the regulation.
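The record-keeping duty above lends itself to a structured format. Below is a minimal sketch of a processing-activity record in the spirit of GDPR's accountability requirements (Article 30); all field names and values are hypothetical examples, not a prescribed schema:

```python
# Minimal sketch of a record of processing activities (GDPR Article 30 style).
# Every value here is a hypothetical example.
processing_record = {
    "controller": "Example Council",
    "purpose": "Administering housing benefit claims",
    "data_categories": ["name", "address", "income details"],
    "data_source": "Collected directly from the applicant",
    "recipients": ["Example Housing Association"],  # who the data is shared with
    "retention_period": "6 years after claim closure",
    "security_measures": "Encrypted at rest; role-based access",
}

def shared_with(record):
    """Return who the personal data is shared with - the list you would
    work through when notifying recipients of a correction, as GDPR requires."""
    return record["recipients"]
```

Keeping this inventory machine-readable makes the "inform other organizations of corrections" obligation a lookup rather than a scramble.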

3.      Privacy notices

It is important to review the privacy notices currently in place and put in a plan for making any required changes before GDPR implementation. When personal data is being collected, you currently need to provide specific sets of information such as information pertaining to your identity and how you propose to use that information. This is generally done with a privacy notice.

The GDPR requires you to provide some additional information in your privacy notices, such as your lawful basis for processing the data and the data retention periods. You are also required to state explicitly that people have a right to complain to the ICO if they believe there is a problem with the way their data is being handled. The GDPR requires this information to be provided in concise, clear, and easy-to-understand language.

4.      Individual rights

You should review your procedures to confirm that they cover all the individual rights set forth in the GDPR. These are the rights provided by the GDPR.

  • To be informed
  • Of access
  • To rectification
  • To erasure
  • To restrict processing
  • To data portability
  • To object
  • Not to be subject to automated decision-making, including profiling

This is an excellent time to review your procedures and ensure that you will be able to handle various types of user requests related to their rights. The right to data portability is new with the GDPR. It applies:

  • To personal data provided by an individual;
  • When processing is based on individual consent or to perform a contract; and
  • Where processing is being done by automated methods.

5.      Subject access requests

You would need to plan how to handle requests in a manner compliant with the new rules. Wherever needed, your procedures will need to be updated.

  • In most cases, you will not be allowed to charge people for complying with a request
  • Instead of the current period of 40 days, you will have only a month to comply
  • You are permitted to charge for, or refuse, requests that are manifestly excessive or unfounded
  • If a request is refused, you are required to tell the individual why. You must also inform them of their right to complain to the supervisory authority and to seek a judicial remedy. This has to be done, at the very latest, within a month.
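The one-month response window is calendar-based, so computing the deadline needs care at month ends. Here is a small sketch using only the standard library; clamping to the last day of shorter months is a simplifying assumption, not legal advice:

```python
import calendar
from datetime import date

def sar_deadline(received: date) -> date:
    """Deadline one calendar month after a subject access request is
    received, clamped to the last day of shorter months."""
    year = received.year + received.month // 12
    month = received.month % 12 + 1
    last_day = calendar.monthrange(year, month)[1]  # days in the target month
    return date(year, month, min(received.day, last_day))

# A request received on 31 January 2018 falls due on 28 February 2018,
# because February has no 31st.
deadline = sar_deadline(date(2018, 1, 31))
```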

6.      Consent

It is important to review how you seek, record, and manage consent, and whether any changes are required. Existing consents need to be refreshed if they don't meet the GDPR standard. Consent must be freely given, specific, informed, and unambiguous. A positive opt-in is required; consent cannot be inferred from silence, pre-ticked boxes, or inactivity. The consent request has to be separate from the rest of the terms and conditions, and simple methods must be provided for individuals to withdraw consent. Consent must also be verifiable. Existing DPA consents do not need to be refreshed, however, if they already meet the GDPR standard.
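Verifiable consent implies recording, at minimum, who consented, to what, when, and how, plus a simple path to withdrawal. A minimal illustrative sketch follows; the field names are assumptions for illustration, not GDPR-mandated terms:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    """Minimal verifiable consent record; field names are illustrative."""
    subject_id: str
    purpose: str                       # the specific purpose consented to
    granted_at: datetime               # when the positive opt-in happened
    method: str                        # how consent was captured
    withdrawn_at: Optional[datetime] = None

    def is_active(self) -> bool:
        # Consent counts only while it has not been withdrawn.
        return self.withdrawn_at is None

    def withdraw(self) -> None:
        # Withdrawal should be as simple as giving consent was.
        self.withdrawn_at = datetime.now(timezone.utc)

consent = ConsentRecord(
    subject_id="user-123",
    purpose="marketing emails",
    granted_at=datetime.now(timezone.utc),
    method="explicit checkbox, separate from terms and conditions",
)
```

Storing the capture method alongside the timestamp is what makes the consent demonstrable later, which is the heart of the verifiability requirement.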

7.      Aspects related to children

Start considering whether systems need to be put in place to verify the ages of individuals and to obtain consent from parents or guardians for data processing activities. GDPR introduces specific consent requirements for children's personal data. If your company provides online services to children, you may need a parent or guardian's consent to lawfully process their personal data. Under GDPR, the minimum age at which a child can give consent to this sort of processing is 16; in the UK, this may be lowered to 13.

8.      Aspects related to data breaches

You should ensure that you have the right procedures in place to detect, investigate, and report personal data breaches. The GDPR imposes a duty on all companies to report certain types of data breaches to the ICO, and in some situations to the individuals affected. The ICO has to be notified of a breach when it is likely to put individuals' rights and freedoms at risk, for example through damage to reputation, discrimination, financial loss, or loss of confidentiality. In most such cases, you will also have to inform the affected individuals directly. Failure to report a breach can attract a fine in addition to any fine for the breach itself.

9.      Requirements related to privacy by design

The GDPR turns privacy by design into a concrete legal requirement under the umbrella of “data protection by design and by default.” In some situations, it also makes Privacy Impact Assessments mandatory; the regulation refers to them as “Data Protection Impact Assessments” (DPIAs). A DPIA is required whenever data processing is likely to pose a high risk to individuals, such as when:

  • New technology is being put in place
  • A profiling action is happening that can significantly affect people
  • Processing is happening on a large set of data

10.  Data protection officers

A specific individual needs to be designated to hold responsibility for data protection compliance. You must designate a data protection officer if:

  • You are a public authority (courts acting in normal capacity exempted)
  • You are an institution that carries out regular monitoring of individuals at scale
  • You are an institution that performs large-scale processing of special categories of data such as health records or criminal convictions

Many of GDPR’s important principles are the same as those defined in the DPA; still, there are significant updates that companies will need to make in order to stay on the right side of GDPR.

Author: Rahul Sharma



Right Data Storage For Government Organizations: Public vs. On-Premise


Largely fueled by the US Federal Cloud Computing Strategy, many government institutions and organizations have been gradually migrating to the cloud. The cloud has proven beneficial not only to businesses but also to government organizations as they seek to provide improved services to their citizens. Government investment has accordingly been immense: according to IDC's projections, spending currently stands at $118.3 million on public cloud and $1.7 billion on private cloud.

By the year 2012, more than 27% of the state and local government organizations had already adopted the cloud. After the Federal Data Center Consolidation Initiative started gaining traction in 2011-2012, the number increased tremendously to more than 50% by 2014. Although it doesn’t strictly limit the organizations to the cloud, the initiative was enacted as a strategy to reduce government expenditure and improve the quality of services rendered to citizens through the consolidation of data centers. All the data warehouses were merged into single data centers, with some government agencies opting to move to the cloud and others relying on single consolidated in-house data centers. As a result, a lot of underutilized real estate was saved, expenditure reduced, and according to the MeriTalk report, “The FDCCI Big Squeeze”, government IT staff and resources were re-assigned to more critical tasks.

So what is the right data storage for government organizations: public cloud or on-premises? Although both approaches have proven beneficial, it's important to compare them critically. This post covers some of the key considerations in selecting data storage for government: security, service delivery, costs, and scalability. Government institutions are unique, and each will strategize most effectively based on its respective data-storage plans.


Security

In an era where cyber warfare is an increasingly preferred method of engagement, government data centers face significant threats from both domestic and foreign hackers. One of the most recent major attacks targeted a federal government website: initially breached in 2013, the site fell victim to hackers again in July 2014, when a test server was targeted. Although no private data was lost, the attack unsettled Americans by exposing the federal system's vulnerabilities.

To prevent a recurrence, the government has been taking data security very seriously by leveraging solutions that strictly adhere to security compliance requirements. Less sensitive data is transmitted through and stored within the cloud, which is managed by dedicated teams of third-party security experts. The most sensitive data, on the other hand, is best stored in in-house data centers, where access is strictly controlled by security agents and regulated through anti-theft software solutions. A good example is the expansive NSA data center in Utah, which is tightly secured and can only be accessed by high-ranking NSA officials.

Service Delivery

The type of services a government organization provides significantly dictates the most effective delivery and storage infrastructure. Services like healthcare and tax filing, which are largely citizen-centered, are best delivered through the cloud, using websites and applications available to all citizens. Through this strategy, the government has managed to reduce queues at its central stations and facilitate remote access to government resources, even for citizens living abroad. Additionally, moving apps to the cloud has helped cut annual expenditure by 21% among the affected organizations.

Some government organizations, on the other hand, deal with data and services which are agent-centered and open only to government agents. They can therefore conveniently use in-house solutions and avoid deploying sensitive apps and data to the cloud. The NSA again falls into this category, since it uses dedicated servers (as opposed to the cloud) to store most of its data and distribute it exclusively among its agents.


Costs

Setup, deployment, operational, and maintenance costs are regarded as the most critical factors in evaluating the suitability of solutions for all types of organizations, business and government alike. On average, organizations use about 5-20% of total lifecycle capital (depending on the scope of the respective data centers) to set up the necessary infrastructure: facilities, networks, devices, and servers. Maintenance, upgrades, real estate rates, and similar expenses account for 50-80% of operating costs. On-premises implementations can therefore have very different cost profiles depending on their complexity and scale. The cost of cloud data storage, on the other hand, scales linearly, since the logistics are effectively outsourced to managed service providers. In many cases, adding an on-premises solution can cost significantly less for an organization that has already set up a stable, comprehensive in-house data center.
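As a rough worked example of those percentages, with entirely hypothetical figures:

```python
def lifecycle_estimate(setup_cost, setup_share):
    """Estimate total lifecycle cost from the setup cost and the share
    (typically 5-20%) of lifecycle capital that setup consumes."""
    return setup_cost / setup_share

# Hypothetical: a $2M infrastructure build-out at a ~10% setup share
# implies roughly $20M of total lifecycle cost, of which 50-80% goes
# to operating expenses (maintenance, upgrades, real estate, etc.).
total = lifecycle_estimate(2_000_000, 0.10)
opex_low, opex_high = 0.50 * total, 0.80 * total
```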


Scalability

Both in-house and cloud solutions are scalable, but scaling the former is a more complicated process. To upgrade or scale up in-house solutions, government organizations have to acquire all the requisite equipment to support their respective upgrades, plus labor to oversee and manage the entire process. Additionally, they are compelled to commit more real estate to expand their server rooms as the number of servers increases. This process can be time-consuming compared to the much simpler process of scaling up public cloud storage. Public cloud servers are inherently elastic and can accommodate virtually unlimited growth as an organization expands.

Hybrid Storage Option

With 70% of organizations (including governmental organizations) reportedly using or evaluating hybrid solutions, the hybrid approach is regarded as the most effective and convenient option. It allows organizations to combine in-house and cloud storage and take advantage of both sets of benefits. To use it in your government organization, first evaluate all your resources against the benefits of both in-house and cloud data storage. Then move only the most suitable services or applications to the cloud and retain the rest within your data center for complete control.

Author: Davis Porter
Image Courtesy: jscreationzs

Top Cloud Security Trends for Government Organizations


According to a report by the RAND Corporation, the cyber black market is growing steadily: hackers are now more collaborative than ever and consistently use sophisticated strategies to target and infiltrate data centers. In the past, they were driven by sheer notoriety and malice, attacking data centers to prove their skills to their peers. That has gradually changed; hackers are now driven by cyber-warfare agendas and a growing black market, where they sell valuable information to the highest bidders.

Of course, their biggest prey is government data centers, which are particularly targeted by cyber armies with agendas against their respective target nations. In fact, governments now face more potentially damaging risks from cyber warfare than from conventional engagement: a single individual with just a computer can successfully launch an attack against major government cloud databases, cripple them, and cause significant socio-economic damage. One of the most notable attacks was directed at Iran's nuclear centrifuges, where the attackers used the Stuxnet worm to damage more than 20% of the installations. Under the cover of different agendas, an Iranian hacking group also recently went on a cyber-attacking spree dubbed “Operation Cleaver,” which ultimately damaged essential government infrastructure in more than 16 countries.

According to experts, this is only the beginning. In research conducted by the Pew Research Center, 61% of the experts surveyed believed that a well-planned, large-scale cyber-attack would be successfully orchestrated before 2025, severely harming national security. With such threats looming, it is essential for governments to implement the most effective emerging security technologies in their clouds. Some of the current top trends include:

Improved Access Control

Many successful attacks sail through because of poor access controls at the targeted data centers. Although Sony is not a government corporation, its recent breach, which even prompted government intervention, was caused largely by careless password and username practices. To prevent such attacks, government organizations are now opting for advanced authentication processes for access to their cloud resources. In addition to standard two-factor authentication, organizations are now implementing biometrics and secondary-device verification in their access control architecture.
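The one-time codes behind much of today's two-factor authentication follow RFC 6238 (TOTP). Below is a minimal standard-library sketch of the RFC's SHA-1 variant with 30-second time steps; production systems would use a vetted library rather than hand-rolled code:

```python
import hmac
import hashlib
import struct

def totp(secret: bytes, unix_time: int, digits: int = 6, step: int = 30) -> str:
    """RFC 6238 TOTP: HMAC-SHA1 over the time-step counter, then
    dynamic truncation down to a short numeric code."""
    counter = unix_time // step
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238's published test secret is the ASCII string "12345678901234567890";
# at Unix time 59 the expected 8-digit code is 94287082.
code = totp(b"12345678901234567890", 59, digits=8)
```

Because the code is derived from a shared secret and the current time, a stolen password alone is not enough to log in, which is exactly the property the access-control trend above is after.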

Sophisticated Encryption

To render data useless to hackers who infiltrate data centers or intercept it in transmission, government organizations have long encrypted their data. Unfortunately, this has proven ineffective against hackers who steal decryption keys or use sophisticated cryptanalysis to recover the data. To prevent future recurrences, government organizations are stepping up their data-at-rest and data-in-transit encryption systems.

Over the years, they have used two-party encryption systems in which both cloud servers and endpoint users hold the encryption keys. This is gradually changing thanks to automated encryption controls, which remove the user from the equation. Instead of distributing encryption keys to individual users, these systems use array-based encryption, which fragments the data during storage and transmission. The meaningless fragments are transmitted individually and can only be reassembled into meaningful data if the server or endpoint device holds all the fragments. Hackers can therefore intercept only meaningless data fragments.
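The fragment-and-reassemble idea can be illustrated in miniature: split a payload into indexed fragments and refuse to reconstruct unless every fragment is present and intact. This is a simplified illustration of the concept, not any vendor's actual scheme, and real systems would encrypt the fragments as well:

```python
import hashlib

def fragment(payload: bytes, n: int):
    """Split payload into n indexed fragments, each carrying a digest of
    the whole so that omission or tampering is detectable on reassembly."""
    whole = hashlib.sha256(payload).hexdigest()
    size = -(-len(payload) // n)  # ceiling division: bytes per fragment
    return [(i, whole, payload[i * size:(i + 1) * size]) for i in range(n)]

def reassemble(fragments):
    """Rebuild the payload only if the joined fragments match the recorded
    digest; a missing or altered fragment yields None, not partial data."""
    ordered = sorted(fragments, key=lambda f: f[0])
    data = b"".join(chunk for _, _, chunk in ordered)
    if hashlib.sha256(data).hexdigest() != ordered[0][1]:
        return None
    return data

parts = fragment(b"classified payload", 4)
restored = reassemble(parts)        # all fragments present: original data
partial = reassemble(parts[:-1])    # a fragment missing: nothing useful
```

An eavesdropper holding a subset of fragments learns only an incomplete byte slice that fails verification, which mirrors the "spoof only meaningless fragments" property described above.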

Digital Risk Officers

According to the Gartner Security and Risk Management Summit of 2014, 2015 will see a proliferation of digital risk officers (DROs). In addition to technology officers, enterprises and government organizations will now hire digital risk officers to critically assess potential risks and strategize on cloud and data security.

This has been necessitated by the continued expansion of the government's digital footprint, as its organizations widely integrate their systems with employee BYOD to improve service delivery. As networks and infrastructure grow, so do the risks, which now require dedicated experts to keep them from developing into successful attacks. With the trend only picking up in 2015, Gartner predicts it will grow rapidly over the next few years as organizations' networks expand; by 2017, DRO adoption among government organizations is expected to reach 33%.

Multi-Layered Security Framework

Since cloud systems are composed of various elements facing different threats, government organizations are protecting their data with tiered, multi-layered security frameworks. To gain access to any of these systems, an attacker must first penetrate a security model composed of several detection, resistance, defense, and tracking layers.

In addition to network firewalls, the government is deploying virus detection and anti-spyware tools on its servers and storage systems to comprehensively protect server operating systems, endpoint devices, file systems, databases, and applications.
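The defense-in-depth model above can be sketched as a chain of independent checks, where a request is admitted only if every layer passes it. The layer functions and the `MALWARE-MARKER` signature below are hypothetical stand-ins for real firewall, antivirus, and rate-limiting components.

```python
from typing import Callable

Layer = Callable[[dict], bool]   # each layer inspects a request and votes pass/fail

def firewall(req: dict) -> bool:
    return req.get("port") == 443                            # allow only HTTPS traffic

def malware_scan(req: dict) -> bool:
    return b"MALWARE-MARKER" not in req.get("payload", b"")  # hypothetical signature check

def rate_limit(req: dict) -> bool:
    return req.get("requests_per_min", 0) <= 100             # throttle abusive clients

def admit(request: dict, layers: list[Layer]) -> bool:
    """A request is admitted only if every layer in the stack passes it."""
    return all(layer(request) for layer in layers)

STACK = [firewall, malware_scan, rate_limit]
```

The point of the tiered design is that a single bypassed layer is not enough: an attacker who slips past the firewall still faces the signature scan and the rate limiter.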

Big Data Security Analytics

“Without big data analytics, companies are blind and deaf, wandering out onto the web like deer on a freeway,” notes Geoffrey Moore, author of Crossing the Chasm, emphasizing the need to implement big data analytics across all relevant departments of an organization, especially those facing the web.

Government organizations are applying this principle directly by embedding big data security analytics in their security frameworks. This allows them to continuously monitor data movement and exchange, as well as potential vulnerabilities that hackers and malware could exploit. Additionally, the data generated is comprehensively analyzed for intelligence on internal and external threats, data exchange patterns, and deviations from normal data handling. Given its efficacy in detecting and blocking threats, Gartner predicts that 40% of organizations, both government and non-government, will establish such systems within the next five years.
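At its simplest, the "deviations from normal data handling" analysis described above is a statistical outlier check: model the baseline of a metric such as hourly data-transfer volume, then flag observations that stray too far from it. The z-score detector below is a minimal sketch of that idea, far simpler than a production analytics pipeline.

```python
import statistics

def flag_anomalies(volumes: list[float], threshold: float = 3.0) -> list[int]:
    """Return indices of observations lying more than `threshold` standard
    deviations from the series mean (a simple z-score outlier detector)."""
    mean = statistics.fmean(volumes)
    stdev = statistics.pstdev(volumes)
    if stdev == 0:
        return []                  # perfectly flat traffic: nothing to flag
    return [i for i, v in enumerate(volumes)
            if abs(v - mean) / stdev > threshold]
```

A sudden 50x spike in outbound volume, for example, would stand out sharply against weeks of steady baseline traffic and could trigger an alert before a bulk exfiltration completes.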

Although no strategy can be regarded as absolute or perfect, these trends are expected to streamline the cloud sector and offer government organizations greater security than in previous years. Over time, this should significantly reduce the number of successful attacks on governmental cloud resources.

Author: Davis Porter
Image Courtesy: Stuart Miles,