Devin Partida, Author at Datamation

5 Digital Transformation Examples

The term “digital transformation” can be defined broadly, encompassing many ideas and largely tied to an organization’s specific goals. For example, a digital transformation might entail a business switching from paper forms to a cloud-based system for better recordkeeping, or developing an application to improve customer engagement options.

Typically, pursuing a digital transformation means prioritizing tools that enhance a company’s operations, profits, growth potential and more. Here are five examples of what a digital transformation could look like for a modern business to use as inspiration for progress at your own organization.

1. Digital Transformation Creates Strategic Advantage for a Healthcare System

Sometimes, a surprising revelation triggers the demand for a digital transformation. That was the case when leaders of a global and academic healthcare system realized how often people used mobile devices to get information about available health-related services. Data showed 40% of website visitors used mobile devices to get information about the system’s offerings.

This healthcare system was already successful, achieving $5.2 billion in revenue and attracting 30,000 employees and more than 4,000 physicians. However, one of the keys to becoming and remaining competitive is recognizing the need for change and responding accordingly. That meant relying on next-generation digital technologies to improve patient engagement and boost brand value.

Although the healthcare system already had a mobile app, customers wanted more than it offered. Early steps in the digital transformation process involved viewing offerings from patient and caregiver perspectives in multiple contexts. Officials used that information to discuss the characteristics of optimal digital experiences and engaged a service provider to develop a new app.

Next, the client narrowed down features for the first release of the mobile app. The service provider mobilized tech specialists to meet those demands and ultimately fielded a unified experience application for the healthcare client just six months after initial planning.

Within three months, the application had a 64% month-over-month adoption rate across a target pool of 500,000 patients. Plus, the healthcare system expects $10 million in total savings due to associated physician efficiency gains and streamlined procedures in other clinical and back-office operations. Since the app allows co-pay authorizations when patients arrive, the client may see an additional $10 million from increased collections.

2. A Data-Centric Plan Helps a Multinational Food and Beverage Brand Excel

Nestlé has over 150 years of history and thousands of brands under its company umbrella. Executives continually look for practical and proven ways to enhance operations, and that often means being open to using data and adopting technologies.

The company focused on maintaining privacy, connecting with consumers, and pursuing ongoing experimentation during a recent digital transformation, according to Global Chief Marketing Officer Aude Gandon.

One component of that was developing a future-proof first-party data strategy emphasizing data safeguarding and privacy, some aspects of which included using consent-mode features within Google Analytics and developing a global advertising technology roadmap.

Ensuring the digital transformation helps Nestlé connect with consumers means reaching 400 million customers with the company’s first-party database by 2025. That information will allow brands under the company’s umbrella to extract valuable insights that improve competitiveness. One Nestlé company used first-party data to achieve a 25% boost in ad spend return by improving connectedness with customers during seasonal events.

Finally, the part of the digital transformation plan focusing on ongoing experimentation heavily relies on cloud computing. In one project related to a coffee brand in Thailand, employees sent large volumes of data from past campaigns to Google Cloud. They then used machine learning algorithms to predict the most appropriate creative messages to show to specific YouTube audiences. This tactic caused a 17% increase in cost-per-view metrics and a 12% lift in ad recall.
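As a rough illustration only, not Nestlé’s actual pipeline, a workflow like that could take the shape of the sketch below: training a simple classifier on past-campaign records to suggest which creative to show a given audience segment. The column names and values are invented for the example.

```python
# Hypothetical sketch: predict the best-performing creative for an audience
# segment from historical campaign data. Columns and values are illustrative only.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

campaigns = pd.DataFrame({
    "audience_age_band": [1, 1, 2, 2, 3, 3, 1, 2, 3, 1],
    "watch_time_sec":    [12, 30, 8, 25, 40, 5, 22, 18, 35, 9],
    "device_mobile":     [1, 0, 1, 1, 0, 1, 1, 0, 0, 1],
    "best_creative":     ["upbeat", "calm", "upbeat", "calm", "calm",
                          "upbeat", "calm", "upbeat", "calm", "upbeat"],
})

X = campaigns.drop(columns="best_creative")
y = campaigns["best_creative"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("Held-out accuracy:", model.score(X_test, y_test))

# Suggest a creative for a new audience segment
new_segment = pd.DataFrame(
    {"audience_age_band": [2], "watch_time_sec": [20], "device_mobile": [1]}
)
print("Suggested creative:", model.predict(new_segment)[0])
```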

These examples show why aligning long-term plans with certain focal points is often useful. Otherwise, it could become too easy to get off track with the digital transformation. That’s especially likely to happen if numerous decision-makers are weighing in with different opinions and not agreeing about the best ways forward.

3. Digital Supply Chain Planning Improves Pandemic Coping

Many digital transformations involve exploring how new technologies help companies reach their goals. For example, some business leaders are examining how non-fungible tokens (NFTs) could strengthen their supply chains. Improved supply chain visibility is a priority for many leaders, regardless of industry, and some experts believe NFTs could track parts across various locations, allowing better forecasting of delays.

In one case, DuPont utilized digital supply chain planning through scenario modeling that fostered better preparedness. The company’s supply chain experts scrambled to cope with the global market’s unpredictability in early 2020 as the COVID-19 pandemic worsened. They found the systems they’d previously used no longer met needs in the unprecedented circumstances. Leaders decided DuPont needed a customized digital platform.

The result was a digital tool that lets people plan for what-if scenarios and make better business decisions, even in an uncertain environment. The system uses a custom-built algorithm and an open-source platform that let users run multiple potential scenarios at once within minutes, saving time and providing trusted results.
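To make the idea concrete, here is a hypothetical, heavily simplified sketch of what-if scenario planning in Python. The scenario names, numbers, and toy projection model are invented; DuPont’s actual platform is proprietary and far more sophisticated.

```python
# Hypothetical sketch of what-if scenario planning: evaluate several supply chain
# scenarios in parallel with a toy projection model. All numbers are illustrative.
from concurrent.futures import ProcessPoolExecutor

def project_shortfall(scenario):
    """Toy model: projected units short over a 12-week horizon for one scenario."""
    demand = 1_000 * scenario["demand_multiplier"]       # weekly demand
    supply = 950 * scenario["supplier_reliability"]      # weekly deliveries
    shortfall = max(0.0, (demand - supply) * 12)
    return scenario["name"], round(shortfall)

scenarios = [
    {"name": "baseline",        "demand_multiplier": 1.0, "supplier_reliability": 1.00},
    {"name": "demand_spike",    "demand_multiplier": 1.3, "supplier_reliability": 1.00},
    {"name": "port_disruption", "demand_multiplier": 1.0, "supplier_reliability": 0.80},
    {"name": "combined_shock",  "demand_multiplier": 1.3, "supplier_reliability": 0.80},
]

if __name__ == "__main__":
    # Run all scenarios at once, in the spirit of the platform described above
    with ProcessPoolExecutor() as pool:
        for name, units_short in pool.map(project_shortfall, scenarios):
            print(f"{name:>15}: projected shortfall of {units_short} units")
```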

Users can also get financial projections, capacity planning, and inventory planning for up to two years into the future. That makes it easier for supply chain executives to make the choices needed to get products to those who need them. The digital platform supports the distribution and manufacturing of more than 1,000 products that reach people worldwide. It also allows future scenario testing for nine supply chain categories across 75 locations.

DuPont’s supply chain planners run more than 20 scenarios per month. The company also has over two dozen people trained to use the tool.

4. Electronics Brand Aims for an Omnichannel Approach

The digital landscape in today’s business environment is always changing. Improvements and advancements happen continually, pushing leaders to evolve their companies.

In one case, Currys, a market leader in the United Kingdom’s tech retail sector, pursued a transformation that would turn it into a digital-first omnichannel retailer. Leaders already knew that 60% of the company’s customers preferred to shop across multiple channels, so the push to suit shoppers’ wants made sense.

Executives had a three-pillar plan for making improvements:

  • Ensuring an easy shopping experience
  • Providing a connected customer journey
  • Creating capable and committed employees

Using a Customer Relationship Management (CRM) platform helped meet all those goals by providing a 360-degree view of the marketing, sales, and service interactions associated with every customer. Company decision-makers also used specialty software to update legacy infrastructure and cloud-based tools to give people the information they needed from any location.

These improvements enhanced both online and in-store shopping experiences. They also meant employees could focus on building lasting customer relationships rather than merely serving people through single purchases.

Additionally, the digital transformation gave customers more personalized experiences, providing them with relevant prices and other product information to reduce friction. Whether people want to buy a new TV or a kitchen appliance, they’ll appreciate having need-to-know information that guides their interactions.

5. Cloud Computing Helps the University of Bristol Reach More Students

One of the primary goals of a digital transformation is to unlock more opportunities for the organization. Such was the case when University of Bristol officials wanted to branch out into remote learning. That would provide educational options beyond the approximately 28,000 students who take in-person classes each year.

University executives chose a cloud technology solution and realized that moving into the realm of online courses would bring multiple benefits. For example, it would open new commercial appeal for the higher education institution while giving learners the freedom and flexibility to attend regardless of their locations or backgrounds.

Leaders also believe this digital transformation will improve the university’s academic research arm. The University of Bristol counts international meteorological bodies, pharmaceutical researchers, and health care organizations as partners. All those parties have different requirements, but partnering with academic institutions allows researchers to take advantage of computing and storage resources that further their projects.

The university’s leaders knew their digital transformation would span multiple years. However, they did the smart thing from the start and worked with technology experts who could advise them every step of the way. These parties assisted in defining a target operating model and teaching the university’s employees the new skills they’d need to succeed with the cloud-based technology.

Much of this tech support happened remotely, yet it went smoothly. That underscores the provider’s ability to help clients digitally transform without being limited by geographical boundaries.

Digital Transformations Take Many Forms

These five examples show there’s no single way to enact a digital transformation within a company. However, businesses are most likely to get the best results when they take the time to determine what would most severely limit future growth and success. Paying attention to those issues could provide decision-makers with the evidence they need that a digital transformation must happen soon. They’ll also better understand which problems to tackle first when making digital improvements.

Trends in Low-Code/No-Code

Coding is an in-demand skill that usually requires highly specialized knowledge. However, the rapid rise of Low-Code/No-Code (LC/NC) platforms has allowed people to engage in web and app development in a new way. People can drag and drop components in an interface and link them to create applications.

Here are some important trends about the LC/NC landscape and how people feel about it.

1. People Viewing Low-Code Solutions as Core Technologies

Not so long ago, many tech decision-makers saw LC/NC tools as niche products that grabbed their attention but weren’t necessarily critical for day-to-day business operations. However, that’s starting to change, particularly as many companies had to evolve due to challenges brought by the COVID-19 pandemic.

A 2022 survey from Low-Code platform provider Mendix found 94% of companies across various industries used Low-Code solutions in 2022, up from 77% in 2021. Moreover, 69% of respondents saw such offerings as crisis technologies during the pandemic but now view them as core to their business models.

Additionally, half of the respondents perceive Low-Code products as filling gaps in their IT departments, while 43% see them as able to assist with production engineering needs. Another notable takeaway was that four in 10 respondents now use Low-Code platforms for mission-critical applications. In fact, it’s estimated that 63% of app development activity will be done through Low-Code platforms.

Company leaders also find plenty of ways these platforms can help them. The Mendix survey showed 63% used such tools to address problems with logistics, the supply chain, and transportation. About 32% of retailers said Low-Code tools helped them offer curbside shopping pickups. Additionally, half of public sector respondents mentioned improved planning and management of resources and enhanced service access among the benefits.

However, such efforts sometimes come with issues that are not directly related to Low-Code products. For example, one-third of respondents felt frustrated by their company’s legacy systems. That’s why 39% have required proof that Low-Code offerings will integrate with them. That’s smart information to ask for since Low-Code products are relatively new. Getting the assurance of successful integration with legacy systems avoids surprises.

Some people have yet to try Low-Code solutions, but this study shows adoption rates are climbing. As they do, individuals who previously felt unsure will become more confident about exploring the possibilities.

2. LC/NC Providing an Option Beyond Hiring Developers

Since developers are in high demand, many company leaders must devote significant resources to hiring them. That’s often easier said than done, especially if there are relatively few developers in the job market or those within it have plenty of choices regarding where to work.

A 2023 study gave a closer look at the job market for developers and those who want to hire them. One finding was that 53% of developers consider salary the most important factor in a potential job. Another 38% of respondents mentioned having a good work-life balance, and 28% wanted the option to work remotely.

Another finding was that 52% of developers plan to leave their jobs within the next year. Among that group, 67% said the desire to get a higher salary was the main reason behind that decision. That finding suggests managers and human resources professionals cannot merely assume they’ll be able to retain developers after hiring them. These workers know they’re in demand, so they can afford to be picky about finding and staying at the most suitable workplaces.

On the recruitment side, 23% of tech recruiters said they plan to hire at least 50 developers this year. About 42% cited developer retention as their top priority for 2023, and 46% of recruitment professionals said they’d have bigger budgets this year than last. However, it’s not a given that all recruiters will find and attract all the developers they need. This is where LC/NC platforms will prove particularly useful.

They won’t eliminate the need to hire developers, but an LC/NC tool could meet business needs and fill gaps while the hiring process is ongoing. Companies can become more nimble and able to respond to marketplace changes faster than they otherwise might.

3. An Appealing Option for Small Businesses

People who own or operate small businesses often face additional challenges related to resource usage and being able to pursue growth like larger companies can. However, LC/NC platforms could change that, and many analysts who have examined the matter believe they will.

This is not the first time coding has gone through a major change. Object-oriented programming arrived in the 1980s and allowed people to design programs with objects. Similarly, the 1990s and 2000s necessitated using different types of code to meet emerging needs. Companies of all sizes must adapt to stay competitive, but it’s often more challenging for smaller enterprises to make those changes.

An Accenture report clarified why LC/NC tools are vital for helping small to medium-sized businesses (SMBs) adapt to the changing landscape and harness all their tech offerings. In the opening part of the document, the authors mention the e-commerce platform Shopify and how it was instrumental in enabling companies to keep operating once the COVID-19 pandemic closed many physical stores. They believe LC/NC tools will have an equal or bigger impact on small and medium businesses.

A statistic cited in the report mentioned that 70% of small businesses worldwide are ramping up their digitization efforts. Low-Code/No-Code tools are instrumental in allowing that to happen. Another finding was that one in five SMBs began searching for LC/NC platforms because of difficulties finding digitally fluent workers.

Moreover, 47% of respondents believed enterprise-level IT solutions don’t meet their needs because those providing them don’t understand the associated challenges. They said the shift toward LC/NC among SMBs illustrates that issue. When offerings meant for larger companies fail to fill gaps, people will look elsewhere for alternative solutions.

4. Turning Shadow IT Into an Asset With LC/NC

Shadow IT occurs when people use IT products without explicit workplace approval. Many tech professionals view it as a major problem. Consider how a 2022 study revealed that 69% of tech executives view shadow IT as a primary concern related to adopting cloud or Software-as-a-Service tools. Another 52% of respondents said individual employees purchase apps for use at work without the IT department’s knowledge.

However, some advocate turning shadow IT into an asset with LC/NC apps. A McKinsey report explained how Low-Code/No-Code products could allow organizations to increase innovation and speed when business and IT teams collaborate.

The report detailed three specific ways to achieve those aims:

  • Using enterprise-grade LC/NC platforms to customize and expand a product’s out-of-the-box capabilities
  • Augmenting existing products to give them new features and capabilities
  • Prototyping new ideas to establish use cases for new business applications

Many people use shadow IT products because the approved offerings don’t meet all their needs. These individuals are frequently unaware that they’re breaking any rules at their organizations and are merely trying to keep their workflows productive.

IT decision-makers should get feedback about any shortcomings associated with the approved products for employees to use. They could use LC/NC platforms to address those weak points with new products or options that alter what existing tools can do. After all, Low-Code and No-Code products typically shorten the overall development cycle, making workforces better equipped faster than traditional methods allow.

5. Company Representatives Discussing LC/NC More Often

Low-Code and No-Code technologies have a much better chance of succeeding when business leaders understand the advantages and are open to using them. A 2023 report showed a modest but notable rise in LC/NC discussions at companies.

The data indicated an 18% year-on-year increase in such talks in 2022 versus 2021. That’s important because discussions are often critical to help people with corporate buying power determine if they want to pursue certain possibilities.

Numerous software company representatives have pondered using LC/NC tools to reach internal aims faster than they could with conventional coding. Most business leaders know the importance of watching for changes in their respective industries and responding promptly. Otherwise, people who wait too long to react could face challenges catching up with their peers.

It also helps when companies have specific individuals who champion LC/NC products. Humans naturally resist change, even when they can see some of the advantages. However, when someone they know, trust, and respect encourages them to be open to Low-Code and No-Code products, it becomes more likely they’ll eventually embrace them.

These Low-Code/No-Code Trends Matter

Low-Code and No-Code platforms are still evolving, along with people’s opinions of them. Seeing how things play out in the coming months and years will be interesting. In any case, the five trends here are important to watch because they highlight the current state of things. Even if things change later, these ongoing patterns in LC/NC adoption and usage will likely shape what’s ahead.

The Future of Low Code No Code

Low-Code/No-Code (LC/NC) platforms are revolutionizing the software development industry. Today, anyone can use them to create their own app, tool, or website without existing programming knowledge. How will Low-Code/No-Code platforms evolve in the coming years, and how are they forcing the industry itself to evolve?

Evolving Applications of Low-Code/No-Code

The LC/NC market is expected to grow 20% in 2023 alone and reach an estimated value of $26.9 billion. This technology has gained popularity in recent years as a means of closing skill gaps and making app and web development more efficient. However, it still lacks the flexibility of custom apps designed more traditionally by skilled developers.

Current applications for LC/NC development sit somewhere between off-the-shelf and custom solutions. How will these applications change in the next few years? Here are some of the areas in which developers can expect to see change.

Robotic Process Automation (RPA)

Robotic Process Automation is one of today’s most common applications for Low-Code/No-Code platforms. LC/NC is a great fit for RPA because RPA usually involves simplifying something that already exists, such as automating a specific workflow.

Low-Code/No-Code developers already know what they need from an app they want to build, so they can shortcut the process without significant User Experience (UX) design. LC/NC approaches give new developers the tools to build and integrate a straightforward RPA app in the minimum turnaround time possible.
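For a sense of the kind of task involved, the sketch below hand-codes a simple, repetitive workflow of the sort an RPA app typically automates: triaging submitted expenses against an approval threshold. The file names, column name, and threshold are hypothetical; in an LC/NC tool, the same logic would be assembled visually rather than written out.

```python
# Hypothetical sketch of the kind of repetitive workflow an RPA app automates:
# read submitted expense rows and route anything over a threshold for review.
import csv
from pathlib import Path

APPROVAL_LIMIT = 500.00   # illustrative business rule

def triage_expenses(input_csv: Path, review_csv: Path) -> int:
    """Copy rows over the approval limit into a review file; return how many."""
    flagged = 0
    with input_csv.open(newline="") as src, review_csv.open("w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            if float(row["amount"]) > APPROVAL_LIMIT:
                writer.writerow(row)   # queue for manual approval
                flagged += 1
    return flagged

if __name__ == "__main__":
    count = triage_expenses(Path("expenses.csv"), Path("needs_review.csv"))
    print(f"{count} expense(s) routed for manual approval")
```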

In the future, LC/NC platforms may include more advanced RPA capabilities, and may be able to integrate data from more sources or handle more tasks in a single app. This particular use case may lean more toward No-Code platforms, since automation will soon be necessary for more jobs. As more people without coding experience seek to use automation, demand for RPA-specific No-Code platforms will increase.

Simple Web and App Development

The main apps and tools for which Low-Code/No-Code approaches are currently ideal are typically simple in scope and limited in distribution. Most often, a user develops an app solely for in-house use, for their own personal use, or for a one-time event or conference.

For example, Low-Code/No-Code is commonly used for replacing legacy systems. Digital transformation spending is expected to total $3.4 trillion worldwide by 2026. Businesses must evolve their operations and technology to keep up, but that can be difficult without a large development team. Low-Code/No-Code platforms allow companies to upgrade technologies and workflows without in-house developers.

Low-Code/No-Code development platforms aren’t intended for large-scale applications, nor are they ideal for supporting hundreds of users or managing massive quantities of data. In the future, this could change as the technology becomes more capable. For example, Artificial Intelligence (AI) could make it easier to create complex apps without requiring coding knowledge.

Challenges and Innovations in Low-Code/No-Code

How will the capabilities of Low-Code/No-Code platforms evolve, and what new applications are emerging? These platforms will increasingly shift toward requiring zero IT involvement in the development process as AI makes it possible for nearly anyone to create original, customized code.

Generative AI-Powered Coding

Generative AI is changing the game in app and web development. Platforms like ChatGPT are opening the door for anyone to try developing their own app or website with zero prior experience. Users can type in a text prompt explaining what they want, and ChatGPT will do its best to generate code that fits the bill. It can also help debug code that users copy and paste into the prompt window.
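As a rough sketch of what that looks like in practice, the snippet below sends a plain-language prompt to a model through the OpenAI Python library, using the interface as it stood in early 2023 (it has changed in later releases). The model name and prompt are examples only, and an API key is assumed.

```python
# Illustrative sketch of prompt-driven code generation with the OpenAI Python
# library (early-2023 interface). Requires an API key; model and prompt are
# examples, not recommendations.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

prompt = (
    "Write a Python function that takes a list of order totals "
    "and returns the average, ignoring any negative values."
)

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": prompt}],
)

generated_code = response["choices"][0]["message"]["content"]
print(generated_code)   # review and test anything the model produces before using it
```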

Of course, platforms like ChatGPT are not foolproof. They do make mistakes, and users have found flaws and gaps in AI-generated code. As of 2023, GPT-4 excels at small, specific chunks of code but breaks down when asked to write an entire application. It can deliver customized code, but only piecemeal. Developers still need to know what’s required and how it fits with the rest of their apps.

Platforms like ChatGPT could evolve into full-scale app development tools in the future. In many ways, AI is the ultimate Low-Code/No-Code platform. Users type in what they want the code to do and let the AI do the rest. Businesses will likely be able to function with small teams of developers who verify and implement the generated code.

Greater Emphasis on Cybersecurity

One of the pitfalls of today’s Low-Code/No-Code platforms is a minimal ability to customize security features. The lack of visibility into the code behind the scenes simplifies development but blinds developers to potential security risks. Additionally, people using LC/NC approaches without coding knowledge or experience may not be aware of important security features they should include or red flags to watch for.

In the future, Low-Code/No-Code platforms will see more emphasis on cybersecurity. For example, the Open Worldwide Application Security Project (OWASP) has published a Top 10 list of key security risks for Low-Code/No-Code apps. Developers can use it to learn about these risks and how to address them in their development process.

The security options in Low-Code/No-Code platforms themselves will also grow in the years ahead. The global cost of cybercrime is expected to hit $11.5 trillion in 2023 and more than double that by 2027. There will be more demand for advanced security features as security threats grow. For example, developers might begin including AI threat-monitoring tools.

Clearer Intellectual Property Standards

Intellectual Property rights are a growing concern in coding and development, especially since AI can write functional code. When anyone can automate coding, who is really writing it? Who is the developer of new Low-Code/No-Code apps, and who has the IP rights to these programs and any profits made?

These questions must be resolved as Low-Code/No-Code platforms gain in popularity, particularly in the context of growing geopolitical complications surrounding IP rights. For instance, the war in Ukraine led Russia to implement a 0% license fee on IP content from “unfriendly countries” like the U.S. and European nations.

Code and apps can be subject to IP laws, not just content such as books and movies. Low-Code/No-Code platforms may soon be able to develop apps with the same level of customization and precision a professional developer could deliver, and the industry will need to decide who has the IP rights to these new apps—the people using the platforms, or those who designed them.

How Will Low-Code/No-Code Impact Developers?

Low-Code/No-Code technology’s role in the software development industry is also evolving. Many are wondering what the future holds for professional software developers. The combination of AI and Low-Code/No-Code platforms leads some to wonder whether developers will become obsolete. While that won’t happen anytime soon, the developer role is shifting.

Low-Code/No-Code platforms and AI like ChatGPT are tools, like any other technology. They can help developers do their jobs more efficiently and easily but cannot replace the expertise people can provide.

Resolving the skills shortage is one specific area where Low-Code/No-Code platforms will help developers. Coders and programmers are in high demand in all areas of computer science today.

For example, the shortage of cybersecurity professionals leaves many businesses ill-equipped to handle rising cybercrime rates. Similarly, over 37% of recruiters report struggling to find enough developers with the necessary skills for their businesses’ needs. However, young people continue to show a strong interest in computer science, indicating a growing talent pool.

Demand for software development skills continues to grow faster than the available talent pool can keep up with. Low-Code/No-Code platforms will help businesses fill those shortages. Smaller teams of developers can use them to work more efficiently and operate at the same level as a larger group.

Similarly, developers may not need to do much manual coding in the future. Their roles may shift toward designing, testing, and maintaining apps. Meanwhile, Low-Code/No-Code platforms and AI will do the bulk of the actual code-writing process. As a result, developers will be able to roll out apps faster and with less budget required.

Low-Code/No-Code Is Innovating Software Development

Low-Code/No-Code software development platforms are transforming how new apps, tools, and websites are created. Now anyone can get into software development, regardless of prior coding experience.

Low-Code/No-Code platforms will become more capable in the years ahead thanks to the advanced capabilities of AI models like ChatGPT. IP rights and cybersecurity will become important concerns as adoption grows. Professional developers will remain vital to the industry for the foreseeable future, although their roles will evolve to adapt to Low-Code/No-Code processes.

Top Open Source Companies 2023

Open source companies are, for the purpose of the list below, defined as companies that make significant use of open source software. As you’ll see in the list, many of the companies also use proprietary software. Still, the companies below can be viewed as major consumers and producers of open source software.

Enterprise open-source software accounts for 29% of all software in use today, and experts predict it will make up 34% by 2024. As this software category grows, here are 20 companies developing it to keep an eye on in 2023.

For more information, also see: Open Source Software: Top Sites

1. Amazon Web Services

Amazon Web Services (AWS) is one of tech’s “Big Five” and a leader in the cloud computing industry, but it’s also a significant contributor to open-source software. The tech giant’s employees have spearheaded over 1,200 projects on GitHub. Many of AWS’s most popular tools and services are also based on open-source projects.

Product Portfolio

One of AWS’s most popular open-source products is its Cloud Development Kit (CDK). AWS CDK lets users build custom cloud applications using familiar programming languages and preconfigured components, aiding faster development.
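A minimal sketch of what CDK code looks like in Python appears below; it defines a private, versioned S3 bucket as an application stack. The identifiers are illustrative, and actually deploying it assumes the aws-cdk-lib and constructs packages, the CDK CLI, and a bootstrapped AWS account.

```python
# Minimal AWS CDK (v2) sketch: define a private, versioned S3 bucket as code.
# Identifiers are illustrative; deploying requires the CDK CLI and an AWS account.
import aws_cdk as cdk
from aws_cdk import aws_s3 as s3
from constructs import Construct

class StorageStack(cdk.Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        s3.Bucket(
            self,
            "ReportsBucket",                 # logical ID within the stack
            versioned=True,
            block_public_access=s3.BlockPublicAccess.BLOCK_ALL,
        )

app = cdk.App()
StorageStack(app, "StorageStack")
app.synth()   # emits a CloudFormation template under cdk.out/
```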

AWS also has a suite of other services based on popular open-source projects, like the Linux-based Bottlerocket OS and its Elastic Kubernetes Service. Other products of note include Firecracker, OpenSearch and AWS Amplify, all of which help build and manage websites or other cloud applications.

2. The Apache Software Foundation

AWS is a noteworthy corporate open-source contributor, but many leaders in this area are non-profit organizations. The Apache Software Foundation (ASF) — the world’s largest open-source foundation — is a prime example. ASF includes tens of thousands of members, and the organization itself is responsible for more than 350 open-source projects.

Product Portfolio

Apache Spark is one of ASF’s most popular releases. The multi-language data processing tool is a staple for data science and machine-learning workloads, seeing use everywhere from NASA to eBay. ASF is also responsible for Hadoop, a big data software library; Kafka, which powers real-time data applications like Uber’s driver matching; and CloudStack, a virtual machine deployment platform.
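For a flavor of what working with Spark looks like, here is a minimal PySpark sketch that loads a CSV and aggregates it. The file path and column names are hypothetical.

```python
# Minimal PySpark sketch: load a CSV and aggregate it with Spark SQL functions.
# The file path and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sales-rollup").getOrCreate()

sales = spark.read.csv("sales.csv", header=True, inferSchema=True)

totals = (
    sales.groupBy("region")
         .agg(F.sum("amount").alias("total_amount"))
         .orderBy(F.desc("total_amount"))
)
totals.show()

spark.stop()
```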

3. Box

Box is a newer but still noteworthy open-source software producer. The cloud services company specializes in security and workflow efficiency, with customers ranging from the U.S. Air Force to Morgan Stanley. As an open-source developer, Box has released more than a dozen open programs and software development kits.

Product Portfolio

Among Box’s most impressive open-source contributions is Spout, a PHP library that can read and write large CSV and XLSX files while keeping memory usage below 10 megabytes. Box also offers many plugins and prebuilt UI components to use on its platform. These features make the Box environment easier to use and configure to specific use cases, fueling its popularity among many audiences.

4. Builder.io

Another relative newcomer to the open-source software space is Builder.io. Builder is a visual content management system offering drag-and-drop functionality and API-based infrastructure to integrate with other apps. In addition to its central platform, the company has released five open-source projects and 52 repositories.

Product Portfolio

The Builder.io platform is closed-source, but its built-in components like images, text, and columns are open. Its developer tools are also open-source. Builder’s offerings include Qwik, a framework for building instant-on apps, Partytown, which relocates resource-intensive scripts to improve performance, and Mitosis, a universal components compiler.

5. Cloud Native Computing Foundation

Many people associate open-source software with smaller companies like Builder and Box, but large developers like the Cloud Native Computing Foundation (CNCF) account for a considerable portion of the industry. CNCF focuses on cloud computing apps and resources, with more than 150 projects and 187,000 contributors to its name.

Product Portfolio

CNCF’s most recognizable project is Kubernetes. The container orchestration system sees use in 61% of global organizations today and has quickly become the unofficial standard for managing containers at scale.
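As a small illustration of working with a cluster programmatically, the sketch below uses the official Kubernetes Python client to list pods. It assumes a reachable cluster and a local kubeconfig.

```python
# Illustrative sketch using the official Kubernetes Python client: list the pods
# visible to the current kubeconfig context. Assumes a reachable cluster.
from kubernetes import client, config

config.load_kube_config()          # reads ~/.kube/config
v1 = client.CoreV1Api()

for pod in v1.list_pod_for_all_namespaces(watch=False).items:
    print(f"{pod.metadata.namespace}/{pod.metadata.name}: {pod.status.phase}")
```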

The Foundation also manages Prometheus, a data monitoring and visualization platform, Argo, a Kubernetes-based continuous integration and delivery engine, and Rook, a cloud-native storage solution. Dozens more projects populate CNCF’s portfolio across various stages of development.

6. Databricks

Databricks is a smaller but fast-growing open-source software company to watch in 2023. The organization centers around what it calls a “Lakehouse Platform,” which combines aspects of a data warehouse and a data lake. The platform has seen use from the likes of the FDA and AT&T, and more importantly for this list, is based on open software and standards.

Product Portfolio

The Lakehouse Platform itself is open-source, as are its first-party plugins. Chief among these is Delta Lake, an open-format storage layer that lets you build a lakehouse architecture over existing storage systems like AWS S3.
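A minimal sketch of Delta Lake’s basic flow with PySpark appears below: write a DataFrame in the Delta format, then read it back. It assumes a Spark session already configured with the delta-spark package, and the storage path is just an example.

```python
# Hedged sketch of Delta Lake's basic flow with PySpark. Assumes the session is
# already configured with the delta-spark package; the path is hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta-demo").getOrCreate()

events = spark.createDataFrame(
    [(1, "signup"), (2, "purchase"), (3, "refund")],
    ["user_id", "event_type"],
)

# Write the table in the open Delta format, then read it back
events.write.format("delta").mode("overwrite").save("/tmp/events_delta")
spark.read.format("delta").load("/tmp/events_delta").show()
```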

Another open-source project — Delta Sharing — claims to be the first open protocol for secure data sharing. This lets Delta Lake users or other cloud adopters transfer and share files between platforms without jeopardizing security.

7. Docker

Docker is one of the most familiar names in open-source software for many developers. The container-based platform-as-a-service ranked as the most beloved tool in Stack Overflow’s 2022 Developer Survey. It introduced what’s now the standard for containerization and remains a leading software development platform.

Product Portfolio

Docker’s primary platform — the Docker Engine — is open-source. The engine uses APIs to help users build and containerize applications and comes in several packages, ranging from personal use to enterprise-level platforms.
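To show the shape of that API, here is a short sketch using the Docker SDK for Python, which talks to the same Engine API. It assumes a local Docker daemon and that the alpine image can be pulled.

```python
# Illustrative sketch with the Docker SDK for Python: run a throwaway container
# and read its output. Assumes a local Docker daemon.
import docker

client = docker.from_env()

output = client.containers.run(
    "alpine:3.17", ["echo", "hello from a container"], remove=True
)
print(output.decode().strip())

# List whatever containers are currently running
for container in client.containers.list():
    print(container.name, container.status)
```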

In addition to its main engine, Docker also offers at least 12 other open-source projects with multiple Docker-adjacent features. Compose helps run multi-container apps, and Moby provides specialized tools to streamline the assembly of unique containerized applications.

8. Google

As another one of tech’s Big Five, Google needs no introduction in most circles. What most web users may not realize, however, is that the search giant is one of the most significant contributors to open-source software. It was the original developer of Kubernetes before the CNCF took over and is responsible for Android, the world’s most popular mobile operating system, which is also open-source.

Product Portfolio

Android is Google’s most recognizable open-source project, but it’s far from the only one. Chromium — the browser technology powering Google Chrome — is also open-source, as is Chromium OS, Google’s web-based operating system. Google has also developed Go, an increasingly popular programming language, and TensorFlow, an open machine learning platform.
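As a tiny taste of TensorFlow, the sketch below fits a toy Keras classifier on synthetic data. It is purely illustrative of the API, not a meaningful model.

```python
# Minimal TensorFlow/Keras sketch: fit a tiny classifier on synthetic data.
import numpy as np
import tensorflow as tf

X = np.random.rand(200, 4).astype("float32")
y = (X.sum(axis=1) > 2.0).astype("int32")       # toy binary labels

model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, verbose=0)

print("Training accuracy:", model.evaluate(X, y, verbose=0)[1])
```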

9. H2O.ai

While TensorFlow may have Google’s name recognition behind it, it’s not the only open-source machine learning platform. Another one to note for 2023 is H2O.ai, which boasts more than 18,000 organizations as its customers. The platform supports multiple programming languages and offers several automation features to democratize and streamline machine learning development.

Product Portfolio

The company’s base product — H2O — is far from the only open-source project in its portfolio. H2O also develops Sparkling Water, an Apache Spark plugin for H2O, Driverless AI, which automates steps like model building, and H2O Wave, which offers real-time dashboards and web apps for AI developers.
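A brief, hedged sketch of H2O’s Python API follows: it starts a local H2O instance and trains a small gradient boosting model on an invented frame. The column names, values, and settings are illustrative only.

```python
# Hedged sketch of H2O's Python API: train a small gradient boosting model on a
# hypothetical churn frame. Starts a local H2O instance on first use.
import h2o
from h2o.estimators import H2OGradientBoostingEstimator

h2o.init()

frame = h2o.H2OFrame({
    "tenure_months": [1, 24, 36, 2, 48, 5, 60, 3],
    "monthly_spend": [20, 80, 65, 25, 90, 30, 110, 22],
    "churned":       ["yes", "no", "no", "yes", "no", "yes", "no", "yes"],
})
frame["churned"] = frame["churned"].asfactor()   # treat the label as categorical

model = H2OGradientBoostingEstimator(ntrees=20)
model.train(x=["tenure_months", "monthly_spend"], y="churned", training_frame=frame)
print("Training AUC:", model.auc())
```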

10. HashiCorp

HashiCorp is another wholly open-source-focused organization. The company — which develops cloud computing tools — has committed to keeping its core technologies open. That model is paying off, as HashiCorp sees more than 250 million open-source downloads annually.

Product Portfolio

HashiCorp offers eight open-source tools today — Packer, Vagrant, Terraform, Consul, Boundary, Vault, Nomad, and Waypoint. Packer and Terraform automate cloud infrastructure building and management projects, while Nomad, Waypoint, and Vagrant focus on cloud app development and implementation. Vault and Boundary provide security controls, and Consul automates service networking across multiple cloud environments.

11. IBM

IBM is one of the longest-standing companies in the technology industry, and it’s been a leader in open-source software for a similarly lengthy stretch. The corporation has more than 25 years of experience developing open tools and over 3,000 employees actively contributing to these projects.

Product Portfolio

As such a large organization, IBM has an impressive suite of open-source software products. One of the most notable is Machine Learning Exchange, which offers free, deployable deep learning models, democratizing IBM’s leadership in the machine learning space.

Other IBM open-source projects include Incident Accuracy Reporting System, which automates police incident reporting processes, and Data Asset eXchange, where enterprise users share data science libraries and tools.

12. Intel

Intel is another recognizable corporate name with significant open-source contributions. While you likely know the company as a chip developer, Intel’s hundreds of open-source projects cover various applications, from AI development to Internet of Things management.

Product Portfolio

Intel’s most popular open-source product is Open Federated Learning, a Python-based software development framework popular among data scientists and game developers. Another widely adopted project is QEMU, a machine virtualization platform. Intel also produces many Linux tools, including the Linux kernel at the heart of many Chromebooks, and specialized security systems for the OS.

13. Microsoft

Despite building its name on proprietary software, Microsoft is the largest contributor to open-source projects in the world. The company extensively uses open tools in its development, and its massive range of company-and-employee-made open products covers multiple use cases.

Product Portfolio

One of Microsoft’s most significant open-source projects is Azure SDK, a set of libraries to help developers leverage the Azure cloud product suite. Accessibility Insights is a similar tool, providing monitoring and checking features to find and fix accessibility issues in Windows, Android, and web apps.
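For a sense of how those libraries read, here is a short sketch with the azure-storage-blob package that uploads a file to Blob Storage. The connection string, container name, and file names are placeholders you would supply yourself.

```python
# Illustrative sketch with the Azure SDK for Python (azure-storage-blob): upload
# a file to Blob Storage. Connection string, container, and names are placeholders.
import os
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"]
)
container = service.get_container_client("reports")

with open("monthly_report.csv", "rb") as data:
    container.upload_blob(name="2023/monthly_report.csv", data=data, overwrite=True)

for blob in container.list_blobs():
    print(blob.name)
```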

Other noteworthy projects include Kubernetes Event-driven Autoscaling, an application autoscaling tool for Kubernetes, and Open Education Analytics, which aids collaboration between educational organizations for data and AI projects.

14. Meta

Another household name, Meta is likewise a considerable contributor to open-source software. Facebook and Instagram — the products Meta is most well-known for — are closed, but the technology titan has more than 600 open-source projects available today.

Product Portfolio

PyTorch is Meta’s leading open product. This machine-learning framework builds on the Torch library and offers a user-friendly way to develop and train intelligent models, including frictionless cloud development and scaling.
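As a minimal illustration of the framework, the sketch below trains a tiny PyTorch model on synthetic data for a few steps. It shows the basic training loop, nothing more.

```python
# Minimal PyTorch sketch: a small model and a few full-batch training steps on
# synthetic data, just to show the basic loop.
import torch
from torch import nn

X = torch.rand(256, 10)
y = (X.sum(dim=1, keepdim=True) > 5).float()     # toy binary labels

model = nn.Sequential(nn.Linear(10, 16), nn.ReLU(), nn.Linear(16, 1))
loss_fn = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()

print("Final loss:", loss.item())
```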

Meta also develops React, a JavaScript library for building UIs, and Docusaurus, which simplifies the website development and optimization process. As you might expect from Meta’s virtual reality products like the Quest, the company also produces open VR projects, including image synthesis and physics tools.

15. Oracle

Oracle is another software giant with many open-source projects under its belt. While the company’s commercial offerings focus on cloud computing apps and infrastructure, its open-source tools cover a wider range of applications. On top of creating its own components, the organization’s employees contribute to Linux and Kubernetes, among other popular open-source projects.

Product Portfolio

The most notable entry in Oracle’s open-source portfolio is Java. Java is one of the best programming languages for beginners, thanks to its extensive use, relative simplicity, and large support community. Oracle also runs a library of images and configurations for Docker, as well as a Linux distro, the popular MySQL database, and Tribuo, which helps develop machine learning models in Java.

16. OpenAI

OpenAI may have a different history and industry experience than corporations like Oracle and Microsoft, but it has skyrocketed to prominence. The AI developer is behind ChatGPT and DALL-E, which more than 3 million users today apply to their workflows and projects. While not every tool from OpenAI is open-source, several of its underlying technologies are.

Product Portfolio

OpenAI’s most notable open-source release in natural language processing is GPT-2, an earlier generation of the models behind ChatGPT; its successors, GPT-3 and GPT-4, are available only through OpenAI’s API rather than as open code. These models can manage generative text tasks, summarization, text parsing, and translation. Another open-source product worth noting from OpenAI is Point-E. Point-E generates 3D models from text, helping streamline illustration and virtual model development.

17. Red Hat

Red Hat has been a leader in open-source software for years, and that won’t likely change in 2023. The largest open-source software company in the world specializes in cloud computing tools, though it also has many Linux and Kubernetes technologies.

Product Portfolio

All of Red Hat’s products are open-source, setting it apart from many groups on this list that also produce closed commercial tools. Ansible is one of the most significant of these, a platform for automating IT tasks to support DevOps. Other projects of note include the Red Hat OpenStack Platform, which virtualizes hardware and organizes these packages in the cloud, and OpenShift, an enterprise-ready, heavily automated Kubernetes platform.

18. SAP

SAP is another familiar name in software that’s started to lean toward open-source development in the past few years. The enterprise application organization focuses on management and business intelligence products, though its open-source projects focus on app and cloud development. In 2022, SAP ranked among the top 10 commercial contributors to open-source software, cementing its place on this list.

Product Portfolio

SAP has six lead open-source projects, including Gardener, an enterprise-level Kubernetes service management tool, and OpenUI5, which uses open standards to streamline web app development. SAP also contributes to popular open-source projects outside its organization, like Linux, Apache, and OpenJDK.

19. Strata IO

Strata Identity Orchestration (IO) is one of the newest companies on this list, completing its Series B funding in 2023, but it’s already making considerable waves. The company’s primary product — the Maverics IO Platform — secures app identity controls in cloud environments and is closed, but the underlying technology is open.

Product Portfolio

Strata IO’s most significant open-source contribution is the Identity Query Language (IDQL) Standard. IDQL is an open standard for identity access policies, making creating and managing identity and access management (IAM) tools easier. A related open-source project — Hexa — translates proprietary IAM policy standards into IDQL to let multiple IAM systems run together in a single environment.

20. VMware

A highly recognized name for many, VMware has solidified its place in tech through cloud computing services and virtualization tools. The organization is also a member of the Linux Foundation, the CNCF, and the OpenSSF, all of which it regularly contributes to in addition to developing its own open-source projects.

Product Portfolio

Spring is one of VMware’s most popular open-source products. Thanks to its simplicity and emphasis on making Java development faster and safer, it’s the world’s most widely used Java framework. VMware also develops Herald, which supports Bluetooth applications across multiple devices, Harbor, a container image registry, and RabbitMQ, which consolidates numerous messaging protocols to enable easier messaging app development.
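To illustrate the messaging side, here is a short sketch using the pika client to publish and read one message through RabbitMQ. It assumes a broker running on localhost, and the queue name is just an example.

```python
# Illustrative sketch with the pika client: publish and consume one message
# through RabbitMQ. Assumes a broker on localhost; the queue name is an example.
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()
channel.queue_declare(queue="orders")

channel.basic_publish(exchange="", routing_key="orders", body=b"order #1001 created")

method, properties, body = channel.basic_get(queue="orders", auto_ack=True)
if body:
    print("Received:", body.decode())

connection.close()
```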

What Does Open Source Mean?

Open-source software differs from proprietary software because its source code is freely available for use, modification, and sharing. With traditional software, only the original developer can change the program; with open-source solutions, anyone can adapt and contribute to the code.

Because open-source software can take advantage of a wider pool of resources and developers, this model can streamline development and help overcome bugs and security issues. However, many companies opt for the closed model to make it easier to monetize their products.

How Many Software Companies Are Open Source?

Not every software company develops open-source software, but almost all use it. Roughly 90% of companies use open-source software, and 30% of Fortune 100 companies have dedicated open-source development offices, according to GitHub.

Virtually every software business uses open-source tools to some extent in developing their products, and many have employees who regularly contribute to open-source projects. While many of these companies’ final products remain closed, this list of open-source-contributing organizations highlights a growing shift toward open development.

Bottom Line: Top Open Source Companies

With so many tech organizations engaging in open-source software today, it can be challenging to say which is the best. These 20 organizations are by no means a complete list but represent leaders in the open-source space, either by the sheer number of their open products or the significance of these tools. As 2023 goes on, these open source companies will be the ones to watch.

How to Create a Firewall Security Policy, with Examples

This beginner’s guide walks through creating a clear and practical firewall policy for organizations in any industry.

This guide covers all the key elements of creating a firewall security policy and also offers some great firewall policy examples for inspiration.

Writing a firewall policy is a necessary part of security documentation today, and it can be challenging. But it doesn’t need to be confusing.

For more information, also see: Why Firewalls are Important for Network Security

Core Elements of a Firewall Policy

A firewall policy is a document outlining the configuration of an organization’s firewall, including an overview of rules and procedures and who is required to follow them. Before writing a firewall policy, organizations will first need to determine how their firewall will be set up and the architecture and technologies it will use. The National Institute of Standards and Technology has published guidance on effectively configuring firewalls that may be helpful for this process.

The specific contents of a firewall policy will vary from one organization to another. For instance, a large corporation will likely have a longer, more detailed firewall policy than a small business. However, there is a basic firewall policy template that can be used by any type of organization.

Purpose

The first section of a firewall policy is the policy’s purpose. This is a brief statement of one or two paragraphs that explains what the firewall policy is intended to do. It often also includes a short description of what a firewall is, though technical terms are typically explained in a later definitions section.

The purpose of a firewall policy is generally to ensure that firewalls are deployed and configured in a universal way across an organization. The firewall policy may also be intended to increase organizational awareness about firewall and security standards.

Audience or Scope

The audience or scope is an important section of the basic firewall policy template. This section specifies who the policy applies to as well as the technical extent of the policy. Firewall security policies may have an audience section and a scope section or just one or the other. They can often be combined into one section.

For instance, the audience for a business’s firewall policy would be everyone working for the business as well as those using the business’s network. This would include every department in the business, all of the employees, anyone responsible for setting up and maintaining network firewalls, and all of the devices and equipment connected to the business’s networks.

Definitions

The definitions section of the standard firewall security policy template is designed to make sure all readers understand what the policy means. Firewalls are part of a larger network security framework, so a firewall policy necessarily includes many technical terms.

However, most of these terms are likely unfamiliar to the average reader. Examples include terms like “firewall,” “host,” or “network device.” Even more general terms like “electronic equipment” should be defined.

The goal of the definitions section is to ensure everything outlined in the firewall policy is crystal clear. There should be no doubt about what any section of the policy means due to the use of undefined terms. This way, readers are less likely to misunderstand the policy and violate it by mistake.

Additionally, in the event that someone does intentionally violate the firewall policy, they can’t claim ignorance because all of the necessary terms are explained in the definitions section.

Policy and Procedures

The bulk of the firewall policy template is the policy and procedures section. This is where an organization lays out in detail all of the various requirements they have for the way that firewalls must be set up and configured. This section will differ the most from one organization to another.

The policy and procedures section should include information on firewall configuration requirements, specific rules that firewalls must use, and requirements for changing and auditing an organization’s firewalls. Additionally, this is a good place to go over elements of an organization’s data privacy strategy that apply to firewall configurations.

For instance, identity and access management are crucial for ensuring an organization’s data is protected from prying eyes. Firewall security policies need to indicate who is allowed to create and control firewalls on the organization’s networks. If unauthorized staff is permitted to create and configure a firewall, it could result in serious security vulnerabilities.

Compliance Requirements

The next section in a standard firewall policy template is compliance requirements. Some firewall policies do not include this section. It may only be necessary for certain industries where specific cybersecurity standards are legally required. In this case, the compliance requirements section will outline specific procedures required by those regulations.

Some organizations may also include a compliance requirements section for internal use. For instance, an organization might have its own cybersecurity and privacy standards enforced by IT leadership. In this case, the compliance requirements section will go over the procedures needed to ensure compliance with those internal regulations.

Change and Exception Requests

Finally, the last section in the standard firewall policy template covers instructions for change and exception requests. In the event that someone in the organization does need some change to be made to the firewall, they will need to know how to properly submit that request.

This last section of the firewall policy should include all the necessary information on such requests, including who to contact, the necessary forms to fill out, and any requirements or limitations for firewall changes and exceptions.

For more information, also see: What is Firewall as a Service? 

3 Examples of Great Firewall Policies

By using the standard firewall security policy template outlined above, an organization can create a functional policy. For those creating a firewall policy for the first time, it may be helpful to see a few examples of well-written policies.

While businesses often do not make their policies accessible online, many universities and educational institutions do. These example policies offer some great inspiration for the kind of language and layout a firewall policy should typically include.

Northwestern University: Interactive Webpage Firewall Policy

The official firewall policy of Northwestern University is a great example of a modern firewall policy. In the past, firewall policies were often published as PDF documents designed to be printed out on paper. There is nothing wrong with this format, but Northwestern’s more modern approach has a few benefits worth noting.

First, this policy follows the standard layout, including all the key sections mentioned above. The page takes advantage of web design to make the policy easy to navigate with bolded, colored headings. Additionally, since this firewall policy is designed as a web page, Northwestern is able to link directly to necessary forms posted elsewhere on their website. They also conveniently link directly to contact information for IT personnel, a helpful feature for readers.

University of Connecticut: Short and Succinct

Not all firewall policies need to be numerous pages long. Sometimes a brief summary of key details is all that is needed. The University of Connecticut’s firewall policy is a good example of a short and succinct policy. They are able to cover all of the key sections of the standard firewall policy template in a brief, single-page document.

One thing to note in this particular policy is the short definitions section. In this case, the University of Connecticut seems to be assuming readers already have some basic knowledge of firewall terminology or have the security background to look up unfamiliar terms.

For some organizations, this approach works perfectly fine. Organizations simply need to be aware of who will most likely be using their firewall policy. If that audience is mainly technical personnel, a shorter definitions section is often reasonable.

Portland City Firewall Policy: A Basic Government Policy

Every organization that uses network technology should have a firewall policy, including government organizations. The city government of Portland has made its firewall policy available online and it offers a perfect example of a basic government firewall policy. This is a good place to start for local administrators or local legislative organizations.

The Portland City government uses the basic PDF layout for its firewall policy. This conventional approach is common in government organizations, since policies are often still printed out for distribution in offices. Notice that slightly different language is used in this policy compared to non-government policies. For example, “Administrative Rule” is used in place of “Policy and Procedures.”

The Portland City government has also added a brief section on “Intrusion Detection and Prevention.” This section is valuable to include for any type of organization. While not strictly necessary, it is helpful for indicating to readers that the organization is taking steps to actively protect its networks from intrusion.

For more information, also see: Artificial Intelligence in Cybersecurity

Firewall Policy Guide: Creating Your Own

The examples and overview above offer a starting point for crafting a clear and practical firewall policy. Luckily, the process is usually relatively straightforward. For those looking for an easy place to get the ball rolling, the starter firewall policy template below can be used by any kind of organization.

Purpose

Lead with a concise statement about the goal of the firewall policy. For example:

“This policy is designed to protect [ORGANIZATION NAME]’s network and information systems from malicious digital activity by regulating firewall configuration.”

This section may also include a brief description of what a firewall is and an overview of its key elements. 

Audience

The audience section does not need to be long, but it does need to cover everyone impacted by the policy, as well as any relevant devices and systems the policy applies to. For example:

“This policy applies to all [ORGANIZATION NAME] users, departments, and business units as well as any connected devices, systems, and applications.”

Definitions

It may be helpful to write the definitions section last, even though it appears early in the firewall policy template. This section should include any technical terminology mentioned in the rest of the document.

So, write up the whole policy, then go back through it and identify any terms the average reader might not be familiar with. Include these terms and their definitions in the definitions section of the policy. Common terms to define include IP, VPN, firewall, and network firewall.

Policy and Procedures

The bulk of the firewall policy will be the policy and procedures section. The specific contents of this section are completely dependent on each organization’s unique firewall and security circumstances. Examples of details commonly outlined in this section include:

  • Who is responsible for configuring and maintaining network firewalls
  • What inbound and outbound traffic firewalls must allow or deny
  • The consequences of using unauthorized devices or equipment
  • Who may access the organization’s firewalls
  • Who may change firewall rules, software, hardware, and other configurations
  • The methods used to filter traffic through the organization’s firewalls (e.g., packet filtering, application proxy); a simplified packet-filtering sketch appears after this list
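
To make these procedures concrete, some organizations attach a simplified illustration of how traffic rules are evaluated. The sketch below is purely hypothetical and is not tied to any particular firewall product; real rule sets are written in vendor-specific configuration, but the first-match-wins logic is broadly representative.

    # Hypothetical, simplified rule table of the kind a policy might reference
    RULES = [
        {"direction": "inbound",  "port": 443, "action": "allow"},  # HTTPS to public web servers
        {"direction": "inbound",  "port": 23,  "action": "deny"},   # Telnet is never permitted
        {"direction": "outbound", "port": 25,  "action": "deny"},   # block direct outbound SMTP
    ]

    def evaluate(direction, port, default="deny"):
        """Packet filtering in miniature: the first matching rule wins, otherwise deny by default."""
        for rule in RULES:
            if rule["direction"] == direction and rule["port"] == port:
                return rule["action"]
        return default

    print(evaluate("inbound", 443))   # allow
    print(evaluate("inbound", 3389))  # deny (falls through to the default)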

Compliance Requirements

This section is not strictly necessary. The only organizations that generally have to include a compliance requirements section are those subject to legal cybersecurity requirements.

This section functions much like the policy and procedures section above. It should list in detail any additional procedures or rules required to ensure compliance with relevant cybersecurity and privacy regulations.

Change and Exception Requests

This section of the firewall security policy template details the procedure for submitting a request for changes and exceptions to anything in the policy and procedures section above. Include the contact information of the IT personnel responsible for processing these requests. Additionally, make sure to list and/or link to any necessary forms that must be included with change and exception requests. For example:

  • The timeline for submitting, processing, and executing change and exception requests (i.e. the necessary steps to submit a request, the process for approving or denying requests, and when approved requests can be expected to go into effect).
  • Necessary change and exception request forms.
  • Contact information for IT personnel, such as IT support or a security helpline.

For more information, also see: Data Security Trends

Bottom Line: Utilizing the Standard Firewall Policy Template

The firewall policy template and firewall policy examples discussed above offer a starting point for anyone creating a policy for their organization. These policies are a key part of security documentation. Every organization should have a firewall policy today, but the guide above makes it easy to write up a concise, practical policy for any type of organization.

On a related topic, also see: Top Cybersecurity Software

3 Types of Vulnerability Scanners Explained https://www.datamation.com/security/3-types-of-vulnerability-scanners-explained/ Wed, 08 Mar 2023 21:20:15 +0000 https://www.datamation.com/?p=23909 Vulnerability scanners — also known as vulnerability assessment tools — are automated, digital solutions specifically designed to identify vulnerabilities and gaps in an organization’s website, application, and network security systems.

Various reactive cybersecurity tools — such as antivirus software or firewalls — can offer some protection. However, they only respond after a cyberattack or data breach occurs. Modern cybersecurity requires organizations to leverage a combination of reactive and proactive solutions, and vulnerability scanners are among the most useful proactive options.

Continue reading to learn more about vulnerability scanners and the purposes they serve. You’ll also discover three different types of vulnerability scanning your organization could use to bolster your cybersecurity posture.

For more information, also see: How to Secure a Network: 9 Steps 

Vulnerability Scanner Categories

Vulnerability scanners typically fall into four categories — external, internal, authenticated, and unauthenticated. Each of these categories describes a specific area within an organization’s cybersecurity. Below is a brief description of each vulnerability scan category and its purpose.

External vs. Internal

As their names suggest, external and internal scans are designed to identify vulnerabilities in either an external or internal attack scenario. External vulnerability scanners detect gaps an outside attacker can exploit, whereas internal scanners identify weaknesses an insider could abuse.

Authenticated vs. Unauthenticated

Many cybercriminals share the same primary goal: gaining access to user credentials to execute an attack. Authenticated scans evaluate the vulnerabilities a threat actor could reach with a valid user account. In contrast, unauthenticated scans test which vulnerabilities are accessible if an attacker does not have specific access to a website, application, or network.

For more information, also see: Vulnerability Scanning Tools 

3 Types of Vulnerability Scanners

With the remote work trend and cybersecurity risks on the rise, businesses need to leverage multiple cybersecurity solutions. Research suggests many employees make mistakes while working from home that have repercussions for themselves or their employers. With the right vulnerability scanners, companies can proactively identify gaps in their cybersecurity program.

Here are three common types of vulnerability scanners: network-based, application, and cloud. Learn about their features, pros and cons, how they work, and when to use each type.

1. Network-Based Vulnerability Scanners

A network-based vulnerability scan is one of the most vital types of scans in cybersecurity. These scans identify vulnerabilities across an organization’s entire network.

How Network-Based Vulnerability Scans Work

These scans identify and analyze all the systems and devices within an organization’s network infrastructure. The scanner then determines how each one connects to the network and adds it to an inventory.

The scanner analyzes each asset in the network inventory to detect vulnerabilities and common exploitable ports and services. Additionally, these scans can identify weak passwords and authentication errors.
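
As a rough illustration of the port-probing step, the sketch below uses nothing more than Python’s standard socket library. It is deliberately minimal, the IP addresses are made up, and commercial scanners do far more, but it shows the basic idea of checking which services a host exposes.

    import socket

    COMMON_PORTS = {21: "ftp", 22: "ssh", 23: "telnet", 80: "http", 443: "https", 3389: "rdp"}

    def probe_host(ip, timeout=0.5):
        """Return the common TCP ports that accept connections on a single host."""
        open_ports = []
        for port, service in COMMON_PORTS.items():
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
                sock.settimeout(timeout)
                if sock.connect_ex((ip, port)) == 0:  # 0 means the connection succeeded
                    open_ports.append((port, service))
        return open_ports

    # Build a tiny inventory of exposed services for a few hypothetical hosts
    inventory = {ip: probe_host(ip) for ip in ["192.168.1.10", "192.168.1.11"]}
    print(inventory)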

Pros 

  • Reduces manual labor and time
  • Identifies and prevents external threats
  • Gauges the overall security of an organization’s network
  • Helps meet compliance requirements

Cons

  • Constant updates required
  • Occasional false positives
  • Implications of vulnerabilities can be unclear
  • Can miss some vulnerabilities

3 Features of Network-Based Vulnerability Scans

Here are some essential features of network-based vulnerability scans:

  • Robust scanning capabilities
  • Centralized hub or dashboard for continuous monitoring
  • Vulnerability scoring and reporting

When to Use Network-Based Vulnerability Scans

Consider using network-based vulnerability scanners to identify vulnerabilities such as unpatched systems, poorly configured network devices, or a weak network infrastructure. Regardless of type or size, every company should consider using network-based scans.

2. Application Vulnerability Scanners

One of the most widely used scanner types is the application vulnerability scanner. Its primary purpose is to scan an organization’s web and mobile applications across the network to find vulnerabilities and potential exploits.

How Application Vulnerability Scanners Work

Application scanners analyze web-based and potentially unsecured applications on devices like laptops, tablets, and smartphones. This type of scanner discovers the applications running on a company’s systems and checks for outdated versions, risky permissions, and weak security protocols.

These scanners also test code robustness (often alongside penetration testing, another commonly used cybersecurity tool), business and client-side logic, database security, configuration, and more.
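
A full application scanner is a substantial product, but a fragment of what it looks for can be sketched with the third-party requests library. Both checks below, along with the URL, are illustrative assumptions: disclosed version strings in the Server header and a few missing security headers, two findings that commonly appear in application scan reports.

    import requests

    def check_basic_exposure(url):
        """Flag two simple application-level issues: version disclosure and missing security headers."""
        findings = []
        response = requests.get(url, timeout=5)

        # Disclosed server versions make it easier for attackers to look up known exploits
        server = response.headers.get("Server", "")
        if any(ch.isdigit() for ch in server):
            findings.append(f"Server header discloses a version string: {server}")

        # Missing security headers are a frequent, easily fixed scan finding
        for header in ("Content-Security-Policy", "X-Frame-Options", "Strict-Transport-Security"):
            if header not in response.headers:
                findings.append(f"Missing security header: {header}")

        return findings

    print(check_basic_exposure("https://example.com"))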

Pros

  • Helps assess the current state of vulnerabilities 
  • Shows risks posed by vulnerabilities
  • Estimates how much damage vulnerabilities could cause
  • Detects breaches or anomalies

Cons

  • Could be inaccurate or fail to detect vulnerabilities
  • Might be expensive
  • Some scanners cannot analyze custom-built applications

3 Features of Application Vulnerability Scans

Here are essential features to look for in application vulnerability scanners:

  • Identifies the most common app vulnerabilities (SQL injection, command injection, cross-site scripting)
  • Provides results with the highest level of accuracy
  • Scalable, flexible, and affordable

When to Use Application Vulnerability Scanners

Companies and individual employees leverage various web and mobile applications throughout the workday. Organizations looking for a basic level of protection or those relying on web applications should use application scanners.

On a related topic, also see: Top Cybersecurity Software

3. Cloud Vulnerability Scanners

Cloud vulnerability scanners analyze a company’s cloud infrastructure for vulnerabilities. They are an integral part of even the simplest cloud security strategy.

How Cloud Vulnerability Scanners Work

A cloud vulnerability scanner works in four stages — scope, scan, report, and remediate. In the first stage, the scanner identifies cloud-based assets and determines how often each needs to be checked for vulnerabilities; policies set by cloud providers also have to be factored in at this step. Then, the scanner identifies the vulnerabilities within cloud-hosted services.

It reports its findings and lists all vulnerabilities based on severity. Finally, the cloud scanner offers suggestions for fixing these vulnerabilities, allowing a company to work its way down the list and prioritize patching according to severity.
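
Those four stages can be thought of as a simple pipeline. The outline below is hypothetical; every function and data structure is a placeholder rather than a real cloud provider API, but it shows how scope, scan, report, and remediate fit together.

    def scope(assets, provider_policies):
        """Decide which cloud assets are in scope, honoring provider policy exclusions."""
        excluded = set(provider_policies.get("excluded", []))
        return [asset for asset in assets if asset["id"] not in excluded]

    def scan(asset):
        """Return findings for one asset; a real scanner would query provider APIs here."""
        return asset.get("known_issues", [])

    def report(findings):
        """Rank findings by severity so the worst problems are patched first."""
        return sorted(findings, key=lambda f: f["severity"], reverse=True)

    def remediate(ranked_findings):
        for finding in ranked_findings:
            print(f"[severity {finding['severity']}] {finding['title']} -> {finding['fix']}")

    assets = [{"id": "vm-1", "known_issues": [
        {"severity": 8, "title": "Management port open to the internet", "fix": "Restrict with a firewall rule"}]}]
    in_scope = scope(assets, {"excluded": []})
    remediate(report([f for a in in_scope for f in scan(a)]))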

Pros

  • Enables companies to remain secure using the cloud
  • Offers visibility across all cloud assets
  • Assists with compliance
  • Provides real-time analysis

Cons

  • Limited to only scanning cloud infrastructure
  • Must account for a cloud provider’s security policies
  • Might not consider new or emerging cloud vulnerabilities

3 Features of Cloud Vulnerability Scans

Every cloud vulnerability scanner should include the following key features:

  • Detailed reporting with proof-of-concept exploit videos
  • Compatible with the most popular cloud service providers
  • Automates scanning within the continuous integration and continuous delivery (CI/CD) pipeline

When to Use Cloud Vulnerability Scanners

Cloud scanners are automated tools capable of identifying common vulnerabilities in cloud-hosted services, such as Google Cloud Platform, Amazon Web Services, and Microsoft Azure. Any organization using these cloud services should use these specific vulnerability scanners for the best protection.

For more information, also see: What is Big Data Security?

Bottom Line: Types of Vulnerability Scanners

As the business world becomes more reliant on big data and digital technologies, it’s never been more critical for companies to identify and mitigate common security vulnerabilities. Vulnerability scanners are a cybersecurity solution that large corporations and small to medium-sized businesses alike can use to defend their networks and sensitive data.

Thankfully, IT experts and cybersecurity professionals have created various types of vulnerability scanners to help organizations protect themselves. The three types outlined above are essential for every kind of business, and other notable types exist as well, such as database and host-based scanners.

Consider all the different types of scanners available before making any major decisions. Companies should take advantage of free trials before investing to learn if the scanner suits their cybersecurity needs.

For more information, also see: Data Security Trends

Top 10 Benefits of Data Warehousing: Is It Right for You? https://www.datamation.com/big-data/top-10-benefits-of-a-data-warehouse/ Thu, 02 Feb 2023 13:20:00 +0000 http://datamation.com/2020/06/15/top-10-benefits-of-a-data-warehouse/

Using a data warehouse is becoming increasingly necessary for today’s organizations, especially as many require current information for better decision-making. If you’re thinking about exploring the possibilities of data warehousing soon, these advantages can show you what to anticipate. They’ll also help you determine if a data warehouse is the best choice for your situation.

10 Benefits of Data Warehousing

1. Unlock Data-Driven Capabilities

The days of making decisions with gut instincts or educated guesses are in the past—or at least, they should be. Today’s leaders can now use recent data to determine which choices to make. A data warehouse makes that possible.

Making effective use of information means eliminating data silos and instances where single departments control most or all of the information. A data warehouse can prevent those unwanted circumstances. Then, it’s easier for the appropriate parties to source the information they need without going through other departments to get it.

A data warehouse serves as a centralized information repository. When people can go directly to one place to get the necessary information, they’ll feel more confident using it to make decisions that shape an organization’s future.

2. Maintain Data Quality and Consistency

Data could become useless to an organization if it is of poor quality and riddled with inconsistencies. However, a data warehouse can support improved quality and consistency, provided people develop a system for finding and fixing errors before transferring content to the data warehouse.

Preparing the data could mean removing duplicate records, putting all data in a standardized format, and correcting outdated data. Ensuring data warehouses contain high-quality information facilitates using those repositories to their fullest potential.
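
As a simple illustration of that preparation step, a data team might standardize formats and then deduplicate records before loading them into the warehouse. The column names and values in this pandas sketch are entirely hypothetical.

    import pandas as pd

    # Hypothetical customer records pulled from two departments
    records = pd.DataFrame({
        "customer_id": [101, 101, 102],
        "email": ["Ana@Example.com ", "ana@example.com", "li@example.com"],
        "country": ["us", "US", "CN"],
    })

    # Standardize formats so the same customer is not counted twice
    records["email"] = records["email"].str.strip().str.lower()
    records["country"] = records["country"].str.upper()

    # Remove exact duplicates after standardization
    clean = records.drop_duplicates()
    print(clean)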

Imagine if a customer service representative could not contact customers about defective products and associated recalls because they lacked current contact details for those individuals. Alternatively, if a data warehouse contains a high percentage of duplicate records, it could cause a person to make the wrong decisions. Creating a quality framework for people to follow is an excellent way to make the data warehouse’s contents as valuable as possible for everyone who uses it.

3. Use Data From Numerous Sources

Most organizations don’t have all of their information in one place. It comes from various departments. The customer service team may have statistics about how many people contact them monthly about specific issues. Then, the marketing department probably has data about specific campaign outcomes and whether they fell short of or surpassed expectations.

The great thing about a data warehouse is it combines data from all of those places within the business, making it more usable for different needs. The warehouse puts that information in a consolidated format, shortening the time frame required for people to get the insights they need.

Accessing information gathered throughout an organization also minimizes the inconsistencies that can occur if people don’t have holistic data. Suppose a leader makes a decision without the benefit of information from all affected departments. They may reach faulty conclusions that compromise outcomes and expose the organization to preventable risk.

4. Realize the Power of Automation

Data warehousing allows people to experiment with how automation might improve their businesses. Automating various steps within operations is becoming more popular, especially as people realize the value of using automation to prevent costly mistakes and accelerate workflows.

A market analysis predicts global industrial automation will be worth $265 billion by 2025, impressive growth from the market’s 2020 value of $175 billion. Businesspeople can rely on data warehouses to support various automation initiatives. They might use software-defined workflows to automate data access and transfer, shortening the time required to gather information for auditors, potential investors, or other parties.

People may also automate data analysis, allowing them to uncover insights faster than before. Other possibilities are to automate error detection and logging. Then, users will become aware of potential problems more quickly and know where to start in finding the root causes. A clear understanding of how an organization uses a data warehouse will highlight some of the most appropriate ways to pursue automation.
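
One hypothetical example of automated error detection is a post-load check that writes to a log whenever a nightly load falls short. The names and numbers below are invented; in practice a scheduler or orchestration tool would run such a check after every load.

    import logging

    logging.basicConfig(level=logging.INFO)
    logger = logging.getLogger("warehouse_checks")

    def check_row_counts(expected, actual):
        """Log an error when a load delivers fewer rows than expected, so teams hear about it quickly."""
        if actual < expected:
            logger.error("Load incomplete: expected %d rows, got %d", expected, actual)
            return False
        logger.info("Load complete: %d rows", actual)
        return True

    check_row_counts(expected=10_000, actual=9_250)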

5. Respond to Business Growth

As companies grow, they often expand into new markets or serve larger customer segments. A data warehouse can contain the information people need to pinpoint the extent of a current growth period and how long it’ll last. Users can also retrieve information to study what likely caused the business’s success. Was it a new product, lower prices, or offering in-demand items at the most opportune times that made people most interested in and loyal to a company?

Business leaders frequently access location data before approving expansion options. Where does it make the most sense to open a new distribution center, convenience store, or dental practice? A company may consider offering subscription services for beauty boxes, fresh food kits, or baby essentials. A data warehouse contains the information that can pinpoint the most viable cities or communities to serve during the initial rollout.

Decision-makers may also depend on a data warehouse to learn whether now is the best time to hire new team members for specific departments or to cope with seasonal demand spikes. Although growth periods are often exciting, uncertainty characterizes them, too. A data warehouse holds the information that can make people more confident in choosing how they’ll respond to growth and how to make that success endure.

6. Get Data Warehousing on a Subscription Model

Data warehouses typically require significant investments and upfront costs. Those realities can make some executives balk at creating and using such offerings. However, the data warehousing-as-a-service model eliminates most of those obstacles. It allows people to pay a flat fee for data warehouse usage and purchase only the specific services they need.

A Maximize Market Research report expects a 21.7% compound annual growth rate for the market between 2022 and 2029. The analysts said the ease of use and ability to access the data over the internet with an application programming interface (API) were some of the factors driving growth. They also pointed out how difficulties associated with the COVID-19 pandemic made more business leaders realize they needed to access current and dependable data to minimize disruptions.

Snowflake, IBM, Google, and Microsoft are some of the top companies offering data warehouses through subscription tiers. Company leaders thinking about using them should first make lists of their must-have features and ponder how such products could help them meet data warehousing goals.

7. Learn More About Your Customers

It’s becoming more common for companies to offer their customers personalized content. Doing that can increase the chances that people spend more time interacting with a service or website or cause them to spend more money on products than expected.

Personalized recommendations can become significant parts of a business model. Consider how most Netflix users decide what to watch after the service’s algorithm suggests content. If people enjoy what they consume, they’re more likely to remain subscribers and have overall good impressions of using Netflix to stay entertained.

Data is the essential ingredient of customer intelligence. What are their pain points, and how could your company ease them? Which factors make people more or less likely to complete a purchase at your site? How did customer behavior change after a recent site redesign? These are all questions a data warehouse’s content could answer.

8. Enjoy Interoperability Between Physical Solutions and the Cloud

Business leaders are embracing the cloud and realizing how convenient it is to have data stored there rather than solely using hardware in company headquarters. Some of today’s data warehouses are entirely cloud-based. Others work at least partially in the cloud, supporting company representatives yet to transition to the cloud fully.

Data warehouses provide the flexibility to work well regardless of a company’s current infrastructure and information storage practices. Thus, no matter what stage a business is in with its cloud usage, there’s a data warehousing solution to suit.

Cloud-based solutions are convenient for people who need to access data from anywhere. Such individuals could include traveling sales representatives, remote workers, and executives who want to compare company performance across multiple sites.

9. Retain the Security of Your Data

Keeping information in multiple locations makes security more challenging. Many executives don’t know how much data they have, let alone how to access it. Since a data warehouse allows storing data in one location, it raises the visibility of the information and facilitates a cybersecurity team’s plans to secure it.

It also helps that most data warehousing platforms have built-in security features. Some can be configured to block harmful SQL code from outside attacks. Others restrict how much data a person can see at a time, minimizing the chances they’ll use the content for unapproved purposes.
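
Blocking injected SQL is largely the platform’s job, but the same principle applies to any application code that queries the warehouse: pass user input as parameters rather than pasting it into the query string. The sketch below uses Python’s built-in sqlite3 module and a made-up table purely for illustration.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, total REAL)")
    conn.execute("INSERT INTO orders VALUES (1, 'EMEA', 120.0), (2, 'APAC', 75.5)")

    hostile_input = "EMEA'; DROP TABLE orders; --"  # treated as plain data, not SQL

    # Parameterized query: the driver escapes the value, so the injected SQL never executes
    rows = conn.execute("SELECT id, total FROM orders WHERE region = ?", (hostile_input,)).fetchall()
    print(rows)  # [] because no region literally matches the hostile string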

Organizations can also specify which people can access a data warehouse’s material and why. Then, individuals only see information that directly relates to their role or task. Further, some data warehouses may lock users out if they try to access them from unusual locations, making it more difficult for online intruders to exploit weaknesses.

10. Study Historical Overviews of Business Activities

Having the most up-to-date information about a specific facet of a business is valuable, but it may only show part of the picture. People in positions of authority often need to see how an organization has changed over time. Those insights allow them to make more confident predictions about what’s on the horizon.

Fortunately, a data warehouse can contain historical information, allowing a person to obtain the necessary information through a few queries. Executives can typically get the content themselves without support from IT teams. That capability enhances productivity and keeps an organization running smoothly.

Historical data can also support preparedness for teams throughout an organization. People usually can’t predict the future with total certainty, but they often find the past holds valuable clues about what might happen soon.

3 Examples of Data Warehouse Benefits in Action

Now that you know some of the main advantages of data warehouses, you’re probably curious about how people use them in real life. These examples help answer that all-important question.

1. Compiling Data for Cancer Research

Efforts are underway to improve cancer data interoperability. Succeeding in this area could reveal new treatments or show which types of cancer respond best to certain widely utilized interventions. However, challenges arise since people often record clinical data in various unstructured formats. Thus, extracting the data for further study becomes prohibitively time-consuming.

Researchers solved this problem by creating an automatically updated data warehouse for cancer patient information. It contains material about 67,617 people with six tumor types. Results from this landmark project showed the automatic-updating feature allowed users to get the most current test results and treatment outcomes. They could then use that information to improve prognoses for current and future patients with cancer.

2. Supporting Information Sharing Between Multiple University Departments

The University of Minnesota’s motto is “Driven to Discover.” It comes as no surprise that the institution has an enterprise data warehouse that aids authorized users in answering essential questions. The data warehouse enables five of the university’s central departments to build and publish visualizations, dashboards, and analyses.

People from the university’s campuses and colleges, as well as individual students, can author data queries and build databases and visualizations. The data warehouse’s scalability makes it a future-proof platform able to meet diverse needs now and later.

3. Tracking Cost and Availability Data for Military Weapons

Virginia’s Defense Acquisition University is a U.S. Department of Defense arm that teaches military and civilian staff and federal contractors about acquisition, logistics, and technology relevant to their work. The organization created the Maintenance and Availability Data Warehouse. It stores more than 12 years of maintenance records from 46 data systems used by the military.

The massive data warehouse contains more than 1.6 billion records of maintenance and supply-related transactions in a standardized format. People use this resource to differentiate between each military weapon’s standard and unusual maintenance and availability aspects. Such specialized information supports informed planning and better national security readiness.

What Are the Disadvantages of Data Warehouses?

Time Required for Data Warehousing’s Early Stages

Many leaders underestimate the time required to plan, design, and populate their data warehouses. These phases can collectively take the better part of a year, although the time frames vary based on factors such as the amount of information going into the warehouse, its quality level, and the number of formats involved.

People must set realistic expectations for their data warehousing initiatives. Otherwise, they may become prematurely discouraged and give up before seeing how data warehouses can help their organizations.

Risk of Outdated Technologies and Performance Degradation

A data warehouse is not something people can let run with little oversight after getting it established. Instead, relevant parties must ensure the system runs with up-to-date technologies and performs smoothly, even as data volumes increase.

All data warehouse projects need ongoing support and investment. When company leaders opt for on-premises solutions rather than those operating in the cloud, there’s an increased risk that the data warehouse’s infrastructure may become outdated. If that happens, people often notice progressive slowness when running queries or otherwise interacting with the system.

Possibility of Not Using the Warehouse Enough to Justify the Resources

Business leaders may find the cost-benefit analysis of data warehouses does not warrant building them. Besides the resources necessary to get the data warehouse operational, a company may also need to hire extra team members to prepare information for the warehouse or oversee its operation.

Decision-makers must verify that the planned use cases justify the implementation and upkeep expenses. If they decide against building a data warehouse, investing in an end-to-end business intelligence platform is another solution. Outlining the specific ways a company will use and benefit from a data warehouse is an excellent way to see if the associated costs make sense.

When Does an Organization Need Data Warehousing?

It’s not always easy to determine the right time to invest in a data warehouse. However, many company representatives start thinking about this option when their workflows require querying data from numerous disparate sources. That can be time-consuming, but a data warehouse typically makes it much more manageable.

Data warehouses can often improve productivity in cases where employees struggle to use data because it exists in many formats. Taking the time to clean up the data before it goes into the warehouse can make the information more usable later. That’s especially true when companies set rules for how to format new data.

Organizations become more likely to need data warehouses as their information volumes rise. Inefficient queries caused by the lack of a data warehouse aren’t significant issues when companies only work with small databases. However, as the total information reaches the petabyte scale and beyond, those slow queries could significantly disrupt business processes, necessitating using a data warehouse as soon as possible.

Deciding Whether You Need a Data Warehouse

Data warehouses can be highly beneficial, especially as more company leaders rely on accurate information to drive their business decisions and maintain competitive advantages. However, they also require significant ongoing investments.

People tasked with exploring the benefits of data warehousing should compare those characteristics with a company’s primary goals. They should also ensure employees across the organization will use the data warehouse often enough to support its creation and upkeep.

Evaluating these aspects makes it easier to judge whether creating a data warehouse is the best option for an organization’s current and future needs. Then, people will have the knowledge needed to feel confident in their ultimate decision.

5 Network Segmentation Case Studies https://www.datamation.com/security/network-segmentation-case-studies/ Mon, 23 Jan 2023 20:49:51 +0000 https://www.datamation.com/?p=23809 Network segmentation separates a large network into smaller, individualized parts. Companies perform network segmentation to strengthen their cybersecurity posture, since each segment enables setting particular security rules.

The following case studies can help companies see how network segmentation is being used by organizations in different industries.

5 network segmentation case studies

  1. ServiceNow
  2. Oil and Gas Refinery
  3. Modern Woodmen of America
  4. Clothing Manufacturer
  5. Children’s Mercy Kansas City

1. ServiceNow

ServiceNow is a leading IT service management provider. Joel Duisman, the company’s principal IT security architect, recognized the need to improve an existing network segmentation strategy. He wanted to strengthen the protection of the company’s core services and domain controllers.

He chose service provider Illumio to meet those needs and moved forward with a phased rollout. The ServiceNow IT team appreciated how Illumio offered real-time visibility and gave consistently high protection in a multicloud environment.

“I sleep better at night knowing that Illumio closes the doors on potential attacks against our domain controllers. The demonstrable risk to the environment is noticeably lessened,” Duisman says.

Industry: IT services

Network segmentation product: Illumio Secure Cloud

Outcomes:

  • Improved compliance with client audits
  • Provided flexibility across cloud and on-premises data
  • Enhanced protection of multiple systems without interruptions

2. Oil and Gas Refinery

Leaders at a major oil and gas refinery were experiencing unexplained data loss that made it more challenging to track emissions and otherwise stay in compliance with industry regulations. They hired the Champion Technologies team to troubleshoot after they couldn’t pinpoint the problem themselves.

Champion Technologies performed an in-depth site survey to compare the refinery’s current setup to best practices. Network segmentation was one of the recommended improvements. The providers also updated network components and provided monitoring software. These improvements give employees a better understanding of what’s happening on their network and ensure they get timely alerts to avoid regulatory fines.

Industry: Oil and gas

Network segmentation product: Champion Technologies provided Layer 2 network switches, Syslog software, and a segmented network

Outcomes:

  • Stopped a known data loss problem
  • Improved network security
  • Tightened industry compliance

3. Modern Woodmen of America

Modern Woodmen of America is a fraternal financial services organization that aims to bring clarity through services, such as retirement planning and life insurance.

The organization uses a self-service portal that members can access anytime and anywhere. However, its traffic management system only handled virtual infrastructures, leaving a significant segment of traffic unmanaged. The company worked with service provider 27 Virtual to transition to VMware NSX-T and solve that problem.

“The inability to set up segmentation policies and east-west firewalling across dev, stage, and prod environments created a security gap that could be exploited by sophisticated threat actors,” says Zach Lotz, senior network engineer, Modern Woodmen of America.

“Once an attacker gained access, they’d have free rein to spread throughout the network.”

Migrating to VMware NSX-T, however, brought notable improvements.

“The best part of segmentation with NSX-T is the ability to start broad — development versus production — and then go more granular as needed, even down to the application level,” Lotz says.

“This allows us to lock down our network to the point where only known traffic can communicate while everything else is blocked. Any anomaly is quickly identified and dropped.”

Industry: Financial services

Network segmentation product: 27 Virtual assisted the client in switching to VMware NSX-T for its network segmentation needs

Outcomes:

  • A more modernized network infrastructure
  • Secure 24/7 access to apps by staff and members
  • Better security against unknown traffic

See more: Network Segmentation vs. Microsegmentation

4. Clothing Manufacturer

A clothing manufacturer approached Burwood Group because it needed to regain payment card industry (PCI) compliance. The service provider performed a network discovery process to learn more about the manufacturer’s apps and how people used them. The team suggested a network segmentation strategy after completing that assessment.

This change allowed the company to go from more than 1,600 security policies to 234. Network segmentation also made it easier to stay compliant and be more proactive about cybersecurity.

Industry: Manufacturing

Network segmentation provider: Burwood Group

Outcomes:

  • Decreased security rules while reducing vulnerabilities
  • Improved the company’s cybersecurity posture
  • Minimized overall business risks

5. Children’s Mercy Kansas City

Children’s Mercy Kansas City is a 700-bed medical facility with a growing and varied collection of connected medical assets.

Staff also collaborated across multiple departments but lacked cohesive data security policies to follow when doing so.

Leaders chose Medigate by Claroty to enhance network segmentation capabilities and accommodate rapid growth. The product gave a risk-scored asset inventory to show people vulnerabilities within the facility’s connected devices.

“Medigate has been a necessary investment,” says Tarunjeet “T.J.” Mann, chief information security officer, Children’s Mercy Kansas City.

“They have provided the means for us to protect and monitor every connected device in a hospital at machine speeds.”

The solution also auto-generated security policies for each network segment, reducing potential threats and giving people better network oversight.

Industry: Health care

Network segmentation product: Medigate By Claroty

Outcomes:

  • Better asset visibility
  • The elimination of numerous manual and outdated workflows
  • More effective collaboration among staff

Bottom Line

These case studies show examples of how network segmentation is being used in various industries: IT services; oil and gas; financial services; manufacturing; and health care.

Clients selected a range of providers serving the network segmentation market for implementations: Illumio; Champion Technologies; VMware; Burwood Group; and Claroty.

Together, the organizations’ network segmentation solutions improved numerous aspects of their networks:

  • Provided flexibility across cloud and on-premises data
  • Stopped a known data loss problem
  • Better security against unknown traffic
  • Decreased security rules while reducing vulnerabilities
  • The elimination of numerous manual and outdated workflows

See more: 5 Top Network Segmentation Trends

7 Network Detection and Response (NDR) Case Studies https://www.datamation.com/security/network-detection-response-case-studies/ Tue, 20 Dec 2022 22:41:03 +0000 https://www.datamation.com/?p=23696 Cybersecurity can be challenging as networks grow increasingly complex and demand for IT talent rises. Amid these trends, automated network detection and response (NDR) solutions are some of the best tools companies have at their disposal.

NDR platforms scan networks for unusual activity to find and respond to potential threats. These early warnings help prevent and mitigate cyberattacks, even when companies don’t have enough IT staff to continually monitor their systems. Here’s a closer look at how several organizations in different industries use NDR tools to stay secure:

7 NDR case studies

  1. Coca-Cola Bottlers Management Service
  2. City of Las Vegas
  3. Narvar
  4. Asante Health
  5. American University
  6. Lake Trust Credit Union
  7. Rackspace

1. Coca-Cola Bottlers Management Service

Coca-Cola’s China-based bottling plant SCMC has considerable network needs. As it implements more Industry 4.0 technologies, the need for fast-acting, effective security becomes increasingly clear.

SCMC turned to Sangfor Technologies’ managed security services for help monitoring this network. Automated threat hunting, where artificial intelligence (AI) algorithms scan for potential breaches, is an integral part of that system. The plant uses this technology to continuously look for attack attempts on its Internet of Things (IoT) devices without needing a larger in-house security workforce.

One of the leading benefits of security automation is how it reduces IT workloads. That was a crucial advantage for SCMC, which recognized it is most vulnerable around the holidays when IT staff may not be in the office. That’s no longer a security threat now that network detection and response is automated.

SCMC’s solution also monitors for non-compliance with company standards. This helps streamline regulatory compliance and find cases of employee misuse, addressing vulnerabilities from human error or malicious insiders.

Industry: Manufacturing

Network detection and response offering: Sangfor Technologies Managed Security Service

Outcomes:

  • Provided 24/7 network monitoring
  • Accelerated threat responses to under half an hour
  • Reduced security vulnerabilities around holidays

2. City of Las Vegas

The growing smart city movement means governments are implementing IoT connectivity throughout urban areas, making cybersecurity a matter of public safety. Las Vegas joined this trend and uses network detection and response technologies to keep it safe.

Las Vegas is home to automated public shuttles, IoT-connected security cameras, connected traffic systems, and more smart city infrastructure. These systems make life more convenient for citizens and visitors but introduce new security challenges. With so much connectivity, one cyberattack could cause widespread damage.

The city uses self-learning AI threat detection to protect this infrastructure. The solution teaches itself how each device and city employee typically behaves, making it easier to spot unusual network activity.

Instead of defining a threat, the system learns the city’s typical networking footprint and is suspicious of anything outside of those boundaries. As a result, it can detect and isolate attacks from known and novel malware strains. Since its implementation, it’s already stopped a spear phishing attack that might’ve bypassed traditional security controls.
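
The vendor’s actual self-learning models are proprietary and far more sophisticated, but the underlying idea of learning a baseline and flagging deviations can be illustrated with a deliberately simple statistical sketch (all traffic figures below are invented):

    from statistics import mean, stdev

    def build_baseline(samples):
        """Learn a simple baseline (average and spread) from past traffic volumes."""
        return mean(samples), stdev(samples)

    def is_anomalous(value, baseline, threshold=3.0):
        """Flag traffic that deviates from the baseline by more than `threshold` standard deviations."""
        average, spread = baseline
        return abs(value - average) > threshold * spread

    past_bytes_per_minute = [1200, 1350, 1100, 1280, 1420, 1190, 1310]
    baseline = build_baseline(past_bytes_per_minute)
    print(is_anomalous(9800, baseline))  # True: a sudden spike worth investigating
    print(is_anomalous(1300, baseline))  # False: ordinary traffic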

Industry: Government

Network detection and response product: Darktrace

Outcomes:

  • Enabled autonomous responses to novel threats
  • Shortened response time to a matter of seconds
  • Allowed further smart city growth

3. Narvar

Customer experience (CX) platform Narvar helps online businesses build order management and tracking systems to meet shoppers’ needs. With more than 7 billion interactions annually, managing these systems carries significant cybersecurity requirements.

“We realized we needed to deploy a security solution that can scale with our operations without creating any disruptions or delays,” says Ram Ravichandran, CTO, Narvar.

The answer was an NDR system from Blue Hexagon. This solution uses deep learning (DL) to inspect transactions in near real-time, including zero-day exploits.

The NDR solution scans traffic flows in under a second on average. This ensures Narvar’s clients can offer seamless e-commerce experiences to their customers without jeopardizing security. Because the solution is also cloud-native, it scales across the company’s multiple availability zones as quickly as needed.

Similarly, the NDR system scales automatically with peak traffic during the holiday season. This ensures a boom in server requests doesn’t endanger Narvar’s clients’ security.

Industry: E-commerce

Network detection and response product: Blue Hexagon

Outcomes:

  • Near real-time threat detection
  • Protection from zero-day exploits
  • Near zero downtime scalability across multiple availability zones

4. Asante Health

Hospitals store vast amounts of sensitive data, and the U.S. medical system is already one of the most expensive globally, making the costs of any breaches severe.

Asante Health, a health care provider with more than 200,000 customers, understands these risks well. In 2021, one of their partners fell victim to a ransomware attack that left them offline for 30 days. Asante turned to a network detection and response solution from ExtraHop soon after to prevent a similar event in the future.

Like other leading NDR solutions, this system uses machine learning (ML) to determine a baseline for normal network behavior. The longer it’s in use, the better it gets at eliminating false positives while improving response to actual threats. This automation lets Asante see their increasingly complex network without a larger IT workforce.

SSL decryption lets NDR users see everything happening on the network, even where conventional tools may not work. This increased visibility helped the organization find and address a substantial SQL vulnerability before cybercriminals could capitalize on it.

Industry: Health care

Network detection and response product: ExtraHop Reveal(x)

Outcomes:

  • Improved network visibility
  • Boosted detection accuracy
  • Stopped a significant SQL vulnerability

5. American University

Education deals with considerable amounts of personally identifiable information (PII). Higher learning institutions, like American University, must protect thousands of sensitive records while ensuring cloud systems work efficiently.

American University manages roughly 60,000 users across 20,000 devices and 700 servers. The institution uses NDR automation to monitor its complex network.

“Intrusion detection requires a security analyst to sift through volumes of signature hits. … We really needed a better, faster way to drink data from the security fire hose,” says Eric Weakland, director of information security, American University.

The university’s new system automates attack detection and response through AI. This solution also prioritizes tasks automatically, determining which events deserve the most attention and assigning them to IT staff accordingly. That organization enables faster responses to the most pressing issues.

Over time, the system also recognizes attack patterns, suggesting broader security changes to improve protection. The university can use this insight to ensure they stay secure as their network grows and adapts.

Industry: Higher education

Network detection and response product: Vectra Cognito NDR

Outcomes:

  • Reduced response times by 20%
  • Gave IT teams more time to focus on critical issues
  • Improved visibility across the attack life cycle

6. Lake Trust Credit Union

Lake Trust Credit Union, one of the largest credit unions in the U.S., manages $1.8 billion in assets across 22 branches with more than 175,000 members.

Demand for digital banking means Lake Trust must protect an ever-increasing network of devices nationwide. Consequently, its security tools must be fast-acting and flexible, catching as many threats as possible across various data centers, endpoints, and departments. NDR was the natural solution.

Lake Trust’s NDR system collects and analyzes data without third parties, reducing IT sprawl and improving security. It then forms behavioral models for each device and user to spot unusual behavior, regardless of where in the network it occurs. Because it temporarily stores network telemetry, it can also help pinpoint where these threats arise, informing bigger security improvements.

This NDR solution also checks the network for regulatory compliance, highlighting anything that may land the credit union in legal trouble. IT teams can then implement any necessary patches to comply with increasing data privacy regulations.

Industry: Banking

Network detection and response product: Cisco Secure Network Analytics

Outcomes:

  • Improved threat visibility across remote endpoints
  • Streamlined regulatory compliance
  • Accelerated threat response despite a small security team

7. Rackspace

Rackspace is an IT service provider, offering managed cloud services to two-thirds of the Fortune 100 and more than 300,000 customers.

Originally, Rackspace responded to threats as they arose, but they soon found this approach was insufficient. The company needed a more proactive approach as cybercrime and security workloads grew. Automated network detection and response tools paved the way for that change.

Rackspace’s NDR solution classifies all network traffic in real-time, providing rapid and context-rich analysis of any security issues. These insights ensure teams can see larger trends that lead to vulnerabilities on top of responding to threats faster. They can adapt to improve their security and hunt for threats proactively as they gain more of this information.

This automation and information let Rackspace’s security workers spend less time responding and more time hunting. As a result, they can prevent more breaches and become better at stopping threats over time.

Industry: IT services

Network detection and response product: Symantec Security Analytics

Outcomes:

  • Shortened response time frames from hours to minutes
  • Improved network visibility
  • Enabled proactive threat hunting
What is a Firewall? Definition, Features, and Types https://www.datamation.com/security/firewall/ Fri, 21 Oct 2022 21:39:33 +0000 https://www.datamation.com/?p=23476 A firewall is a type of software or firmware that prevents unauthorized users from accessing a network as part of a broader network security strategy.

Many of today’s digital devices have them built in. Popular operating systems, such as macOS, Windows, and Linux, ship with prepackaged firewalls that serve as essential components of network security.

See below to learn all about what makes up a firewall:

Top features of a firewall

Most firewalls share several common features. However, because companies can choose from a wide selection of vendors for a solution, it can be challenging to know which will offer the right protection.

Below is a list of firewall features to help you determine which solution best suits your business:

  • Bandwidth control and monitoring: Every firewall should have this feature, which is sometimes called traffic shaping. It allows you to control the available bandwidth of your network for sites, applications, and users.
  • Web filtering: Also known as content filtering, it oversees data packets your computer sends and receives to weed out any compromising, flagged, or forbidden content. A simplified sketch of this allow-or-deny decision follows this list.
  • Logging: An effective firewall can log network traffic, giving you updated information about what’s happening. It can show you vulnerabilities and provide information about an attack happening on the web.
  • Sandboxing: Sandboxing takes files or executables — a file with instructions or options to complete a function on your device — and opens them in a test environment. This feature essentially opens and runs files to scan for any malware or suspicious activity to protect the end user.
  • Threat prevention: A firewall with a threat prevention feature identifies and blocks attacks before they cross into a network, helping companies avoid cyberattacks and their negative implications.
  • Application and identity-based inspection: Companies are constantly changing their applications, so they can use a firewall with an application and identity-based inspection feature. This lets a company apply specific policies to applications or users within the organization to better control their networks.
  • Scalability: Using a scalable firewall solution is important as more companies incorporate digital technologies into their business. They grow as organizations evolve and their cybersecurity needs become more complex.
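
The web filtering feature above, for example, ultimately comes down to an allow-or-deny decision for each request. Real firewalls rely on large, constantly updated threat feeds; the deliberately simplified sketch below, with a made-up blocklist, only shows the shape of that decision.

    BLOCKED_DOMAINS = {"malware.example.com", "phishing.example.net"}  # hypothetical blocklist

    def allow_request(domain):
        """Return True if outbound traffic to this domain should be permitted."""
        return domain not in BLOCKED_DOMAINS

    for domain in ("datamation.com", "malware.example.com"):
        action = "ALLOW" if allow_request(domain) else "DENY"
        print(f"{action} {domain}")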

See more: How Do Firewalls Work? Basic Firewall Fundamentals

Over time, firewalls continue to improve in response to an increasingly threatening enterprise landscape. Networking is a market that prioritizes enhanced security, especially as cyberattacks become more frequent and intense.

See more: How Firewalls are Used by Deakin University, Black Box, Palo Alto Networks, Modis, and Keysight: Case Studies

Types of firewalls

Companies can use several types of firewalls, such as:

  • Next-generation firewall (NGFW)
  • Packet-filtering firewall
  • Proxy firewall
  • Stateful inspection firewall
  • Application firewall

See more: Types of Firewalls Explained

Conclusions

Although firewalls are effective cybersecurity tools for companies, they don’t guarantee the security of your network or data.

Companies must incorporate several other cybersecurity solutions, like antivirus software, encryption tools, managed detection and response (MDR) services, and penetration testing, for the best protection.

However, without an effective firewall, your company’s network is more exposed and could become a target for cybercriminals. Consider deploying an advanced firewall to protect your company and improve its cybersecurity posture.

See more: Why Firewalls are Important for Network Security
