
Making realistic game characters: rigging and body deformation

Designing compelling 3D characters for your video game is one thing; making sure they look and behave realistically is another. From leveraging motion capture data and retargeting animation with Maya, to choosing an A-pose over a T-pose during development, Santa Monica Studio went the extra mile to ensure that their characters look, feel, and act real. Here are a few best practices from a GDC 2019 presentation by the studio’s Lead Character Technical Artist, Axel Grossman, for making realistic game characters.

Producing animation with mocap

Use motion capture

Image courtesy of Santa Monica Studio

In getting your characters ready for battle, nothing beats motion capture over hand-keying movements. In the case of God of War, MotionBuilder was used for its speed and its ability to use real-time camera work. A transfer rig matching Maya's coordinates let objects baked in MotionBuilder be imported directly onto nodes in Maya, so scenes could be built in MotionBuilder.

Make use of animation retargeting

Image courtesy of Santa Monica Studio

Animating character cycles one by one? Ain’t nobody got time for that! One tool you can take advantage of to speed up this process is an animation retargeter, which allows you to swap out characters and maintain the same animation cycle.

Striking a realistic pose

Using Pose Space Deformation (PSD) and Radial Basis Functions (RBF)

For your character to feel real, they need to act out the part in each of their movements. Simply choosing an A-pose over a T-pose can have an impact on how realistic your character’s movements are, with A-poses making for a more natural appearance and proper shoulder height. Pose Space Deformation (PSD) uses Radial Basis Functions (RBFs) to ensure that poses are properly blended from movement to movement.
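To make the RBF idea concrete, here is a minimal sketch (not Santa Monica Studio's implementation) of how Gaussian RBFs can blend authored corrective values across a pose space. The driver poses, target weights, and `sigma` falloff below are illustrative assumptions:

```python
import numpy as np

def rbf_solve(driver_poses, target_values, sigma=45.0):
    """Fit Gaussian RBF weights so each driver pose exactly
    reproduces its authored corrective values."""
    d = np.linalg.norm(driver_poses[:, None, :] - driver_poses[None, :, :], axis=-1)
    phi = np.exp(-(d / sigma) ** 2)  # N x N kernel matrix
    return np.linalg.solve(phi, target_values)

def rbf_eval(query_pose, driver_poses, weights, sigma=45.0):
    """Blend corrective values for an arbitrary in-between pose."""
    d = np.linalg.norm(driver_poses - query_pose, axis=-1)
    phi = np.exp(-(d / sigma) ** 2)
    return phi @ weights

# Hypothetical driver poses: shoulder rotation in degrees (1-D pose space).
poses = np.array([[0.0], [45.0], [90.0]])
# Corrective blendshape weight authored at each driver pose.
targets = np.array([[0.0], [0.6], [1.0]])

w = rbf_solve(poses, targets)
blend = rbf_eval(np.array([45.0]), poses, w)  # reproduces the authored value at a driver
```

At a driver pose the interpolation reproduces the authored value exactly; between drivers it blends smoothly, which is what keeps corrections from popping.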

Image courtesy of Santa Monica Studio

For the making of God of War at Santa Monica Studio, PSD was used primarily for muscle corrections on Kratos and Baldur, as well as facial fixups and corrections. RBF, on the other hand, is about correcting poses by moving joints around. In the case of God of War, it was used for armor deformation on Kratos and other heavily armored characters such as The Traveler, some props and environment rigging, and Freya’s hair.

Rigging and body deformation

Remember driver coverage

When it comes to making sure that props and objects in your environment act according to your character’s movements, you’ll want to keep driver coverage in mind. Drivers are basically our way of representing the field on which the poses play. Without proper driver coverage, you can fall out of your pose space and get pops and visual glitches. Using a ROM (range of motion), you can determine all the spaces where something can move.
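As a rough illustration of checking driver coverage against a ROM (the driver spacing and tolerance below are made-up values, not the studio's tooling), you can sweep the joint's range and flag samples that sit too far from every authored driver:

```python
import numpy as np

def coverage_gaps(rom_samples, driver_poses, tolerance=10.0):
    """Flag ROM samples farther than `tolerance` (degrees) from every
    authored driver pose -- likely pop/glitch zones."""
    d = np.linalg.norm(rom_samples[:, None, :] - driver_poses[None, :, :], axis=-1)
    return rom_samples[d.min(axis=1) > tolerance]

# Hypothetical drivers authored every 45 degrees of shoulder rotation.
drivers = np.array([[0.0], [45.0], [90.0]])
# ROM sweep of the joint in 5-degree steps, up to 120 degrees.
rom = np.arange(0.0, 125.0, 5.0)[:, None]

gaps = coverage_gaps(rom, drivers, tolerance=25.0)
# Everything past the last driver by more than the tolerance gets flagged.
```

Any flagged sample is a spot where the character can "fall out" of the pose space, which is exactly where you would author an additional driver.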

Image courtesy of Santa Monica Studio

Without proper driver coverage, you risk running into problem areas on your characters.

Build dependencies for body deformation

You’ll want a system or hierarchy in place for body deformation so that all your world space matrices allow for your character to easily call up their poses. This might involve deforming by clavicle first, then humerus, then clavicle and humerus together.

Image courtesy of Santa Monica Studio

Traditional skeletal hierarchies combined with linear skinning lose volume. When it comes to body deformation, keeping your character in its pose space becomes an increasingly greater challenge. Linear skinning can cause things like a pec or deltoid popping out and becoming fat and round when it should be thinning out during a movement of the clavicle.
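A tiny numeric demonstration of why linear skinning loses volume (a generic illustration, not code from the game): blending two joint rotations as matrices pulls the skinned vertex inward, which is the thinning PSD correctives have to fight.

```python
import numpy as np

def rot_z(deg):
    """2-D rotation matrix for a joint rotated `deg` degrees."""
    r = np.radians(deg)
    c, s = np.cos(r), np.sin(r)
    return np.array([[c, -s], [s, c]])

def linear_blend(vertex, transforms, weights):
    """Linear blend skinning: weighted sum of per-joint transforms.
    Averaging rotation matrices is what collapses volume."""
    blended = sum(w * mat for w, mat in zip(weights, transforms))
    return blended @ vertex

v = np.array([1.0, 0.0])  # vertex at unit distance from the joint
skinned = linear_blend(v, [rot_z(0), rot_z(90)], [0.5, 0.5])
shrink = np.linalg.norm(skinned)  # ~0.707: the mesh thins toward the joint
```

With a 50/50 blend between a rest joint and one rotated 90 degrees, the vertex ends up at roughly 71% of its rest distance instead of staying on the arc, which is the classic candy-wrapper/volume-loss artifact.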

Image courtesy of Santa Monica Studio

Focus on areas most visible to the camera

When rigging your character, you’ll want to pay extra attention to areas that undergo a high degree of rotation, for example, twisting along bone axes such as the shoulder (where you can get the candy-wrapper effect) and other areas most visible to the camera, like the neck.


Small and Midsize Companies Find Their Edge with Advanced Analytics

Small and midsize companies are constantly under pressure to differentiate themselves in a highly disruptive environment. Every rival—from one-man startups to large conglomerates—can rewrite the competition playbook forever with one new business model, one breakthrough offering, or one creative process. It doesn’t matter if the industry is mature; the company must move immediately to squeeze out every last drop of value from operational efficiency, data-driven insights, and revenue growth.

Small and midsize companies are well-known for pivoting and changing direction with a speed and tenacity that’s difficult for even a multi-million-dollar enterprise with unlimited resources to duplicate. However, this quality leads to significant advantages only when coupled with outcome-oriented, real-time insights made possible by the latest analytics technology.

In the IDC Analyst Connection whitepaper “Analytics for SMBs: Sharpen Operations, Capitalize on Business Opportunities,” sponsored by SAP, Ray Boggs, vice president of small and medium business research at IDC, acknowledged that “business analytics and business intelligence can inform almost every aspect of a growing company’s operations.”

Prime Future Success with Data and Advanced Analytics

Whether it’s classic performance measurement and financial scrutiny; regular sales, costs, and profit reporting; or HR and workforce measurement, analytics can help identify areas for greater efficiency, untapped revenue generation, process improvement, and employee training. Essentially, businesses have no choice but to add advanced analytics to their digital repertoire. If we take a moment to think about the big brands that have disappeared in recent years, it is clear that their demise was the result of limited or delayed insight into the evolution of customer behavior and market dynamics.

Everything that a small and midsize company does is centered on the customer. For this reason, advanced analytics is a great fit when it comes to dissecting and truly understanding customer needs and shopping patterns with a swift, in-the-moment experience. More important, as most successful companies have shown, the business model must also leverage that information to add value to the customer experience through, for instance, micro-personalized recommendations, content, and campaigns.

A prime example is Snow Peak, an outdoor gear retailer that has grown from a single store in the mountains of Japan to a multinational enterprise with over 100 stores. The company attributes its growth to its commitment to understanding customers and offering products that closely meet their needs. However, Snow Peak realized that its use of Microsoft Excel, an outdated enterprise resource planning (ERP) system, and handwritten notes was not an effective way to share customer tastes, preferences, and buying histories with other salespeople and event planners. For example, staff may identify the right product for a customer—only to later find that the item is out of stock and miss an opportunity to make a sale and a customer happy.

By adopting predictive analytics in the cloud, Snow Peak centralized, unified, and controlled fragmented information about customers, inventories, and all other aspects of the business and made this data available to executives, salespeople, and other business users in real time. Furthermore, it optimized inventories by coupling supply and demand data and keeping it up to date.

Snow Peak’s decision to scale its customer experience with advanced analytics in the cloud is one of a variety of use cases that can greatly improve the performance of a small and midsize business.

Additional applications that are just as impactful—if not more—include:

  • Real-time collaboration: Employees, suppliers, partners, and customers can collaborate with access to in-the-moment, accurate data, which is critical to keeping everyone in the value chain engaged and informed
  • Operational optimization: Companies can balance profitability, quality, and cost control with on-the-fly what-if analysis and insight acceleration through machine learning
  • Extended supply chain: Predictive analytics and the Internet of Things provide supply chain operations with the information needed to respond to ever-evolving market expectations while maintaining profitable sales and operations, demand fulfillment, response and supply planning, and inventory optimization
  • Core business processes: Emerging analytics technology—including machine learning, artificial intelligence, and blockchain—can help create a well-skilled, productive workforce; free employees from repetitive, low-value tasks; optimize supplier negotiations; and speed accurate decision making and planning

Even though small and midsize companies have fewer employees, less cash flow, smaller inventory, and less diverse product lines than their larger counterparts, the ability to know everything about themselves and their customers brings an opportunity to stay one step ahead of the competition. But the data is only as good as the business’ ability to capture, process, analyze, communicate, and act on it in a timely, efficient way. By using advanced analytics, small and midsize companies can acquire the skills and mindset needed to turn decision-making processes and strategies into transformational, leading-edge innovation.


What’s New in 3ds Max

See how 3ds Max has evolved since 2016!

3ds Max has provided many updates over the years, and we will continue to deliver top-quality updates for better performance and improved workflows.

Click on the image below to get a glimpse at what we have been up to.

Bringing your ideas to life

Autodesk is committed to responding quickly to 3ds Max user feedback. That’s why this release includes many features and fixes submitted by the 3ds Max user community at 3dsmaxfeedback.autodesk.com.


Summer Fun: Data-Driven Guide to Navigating Your Summer

Summer is here and in full swing. Whether you’re on vacation or covering for someone on vacation, sit back, relax, and take a moment to see what this summer’s trends are and how they might be impacting your business.

Hot, Hot, Hot Data

Have you found yourself sitting on your porch exclaiming that this is the hottest summer ever?! You would be correct. According to Climate Central, the planet’s temperature has been gradually increasing year after year. In fact, “16 of the 17 hottest years on record have occurred this century,” reports ClimateCentral.org.

Get up to speed on your investments in sun umbrellas, swimsuits, and pool supplies: the heat is here to stay. It is having an impact on consumer and summer trends that have been shifting over the past few years.

I Scream for Ice Cream

This heat is warming up markets. We all know the best way to cool down on a hot day is ice cream. It’s not only in-store sales that are impacted; sales of packaged ice cream rose 6.5% in the last year, according to dairyfoods.com.

Ice cream isn’t the only sun-driven industry on the rise; sun care products are growing too. According to Global Industry Analysts, Inc., the sun care products industry is projected to reach $11.1 billion by 2020. Currently, the largest market worldwide is Europe. The biggest upcoming markets are Asia-Pacific, the United States, and Latin America, with Latin America projecting sales growth of 10.2% through 2020.

These are just two examples of growing markets. It’s not just about the right market at the right time, though; it’s also about how consumers find and buy.

Website Traffic Summer Blues?

While road traffic is getting worse every year (in congested cities, the daily commute is growing by 15 minutes per year), your website traffic may be slowing down these summer months. With recent retail trends like the Retail Meltdown of 2017, there are a lot of opportunities to act upon with online shopping. More importantly, companies need to make sure they are capitalizing on their approach to mobile shopping. It will have a large impact on purchasing trends moving forward.

Consumers are not only online and mobile-driven but also more informed. Be ready for customers who do all their prep work online before larger purchases, as well as comparison shopping. Gone are the days when poor customer service could be swept under the rug; online reviews see to that. Product experts are increasingly expected to lay out clear product comparisons for consumers.

Summer Job Money

Not all industries have a summer rain cloud over them. Overall consumer product purchases may be down, but restaurants and travel are booming.

Hotel occupancy is booming. Domestic airlines have flown more passengers each year since 2010, and last year U.S. airlines set a record with 823 million passengers. The rise of restaurants is even more dramatic. Since 2005, sales at “food services and drinking places” have grown twice as fast as all other retail spending. In 2016, for the first time ever, Americans spent more money in restaurants and bars than at grocery stores, according to The Atlantic.

We’ll see dramatic shifts moving forward in all industries; for some companies, it will be sink or swim.

Planes, Trains, and Automobiles

On the other side of traffic gridlock issues, there is currently a booming tourism industry, making it important to know this year’s travel trends. TrekkSoft points out seven for a successful tourism business: Millennials, Active & Adventure Trips, Female Solo Travel, Food Tourism, Responsible Tourism, Mobile Photography, and Business and Leisure Travel (“bleisure”).

Millennials are “officially the largest generation in history, beating out Baby Boomers.” They also have disposable income and love to travel. Expect them to share their opinions on travel online, and to do their travel research online.

Also, expect travel to be driven by how ‘Instagram worthy’ it is. Whether a destination is notable for its natural beauty or hotel beauty, expect popularity to grow through mobile pictures.

Happy summer, safe travels, and stay cool!


3ds Max 2020.1 Feature Updates

Autodesk 3ds Max 2020.1 focuses on enhancing your workflow by delivering modernized tools such as keyboard shortcut management and detachable viewports. Plus, building on Chamfer modifier feedback from the 2020 release, we have polished the tool further to make it more predictable.

Detachable Viewports

Enhance your workspace experience by extending your viewports across various monitors.

Watch as 3ds Max team member Ken Larue takes you through the details:

Highlights:

  • Leverage multi-monitor set-up and float up to three additional viewport panels across multiple monitors
  • Each viewport panel can be separately configured

Updated default keyboard shortcuts and improved hotkey management

Easily customize, merge, and visualize shortcut keys with a new hotkey system and hotkey editor tool.

Watch as 3ds Max team member Brent Scannell takes you through the details:

Highlights:

  • New Hotkey Editor tool replaces the Keyboard tab in the Customize User Interface dialog
  • Search for action by keywords or by current hotkey assignments
  • Clear current assignments and conflicts with undo history
  • Filter actions by current customization status, and by groups
  • Migrate legacy keyboard shortcut files without missing out on updates to the defaults
  • User hotkey settings are saved separately from defaults in a dedicated and accessible User Settings folder
  • Configuration selector allows easy and quick switching between hotkey sets

Chamfer modifier improvement

Building on user feedback from the 2020 update, the Chamfer modifier now offers even more efficient and predictable modeling options.

Watch as 3ds Max team member Martin Coven takes you through the details:

Highlights:

  • By Weight chamfer type provides absolute weight on an edge, altering the shape of the mitered corners
  • Scale spinner globally multiplies the Weight values in the scale amount
  • Crease Weight averages between connecting edges provide a varying width over a span of edges
  • Depth Weighting allows control on a per-edge basis
  • Depth Type combo box offers Fixed and By Weight; By Weight hides the modifier’s spinner and fetches per-edge depth values from the mesh channel
  • A Depth spinner has been added to Editable Poly and Edit Poly in the Edit Edges rollout, under Crease and Ribbon in the Edge panel
  • Radius Bias alters the shape of the chamfer in areas with acute corners, making them more uniform
  • Its spinner blends the chamfer size toward the radius amount to handle sharp corners

Additional Improvements

The new double-click selection allows you to model faster and more efficiently.

– Support added for EditablePoly, EditPoly modifier, EditableSpline and EditSpline modifier.​​

– Works with Unwrap UVW in both UV Editor and 3d viewports.​

– Includes modifier hotkeys to add selection or subtract selection and works in Maya Interaction mode​

– Double-click objects to select all contiguous faces, verts, segments, etc.

Command Panel Improvements allow you to switch between different Edit Poly modes 70% faster

– Autocomplete for MAXScript is on by default in the MAXScript editor and works out-of-the-box.

– The new generateAPIList <stringstream> MAXScript API makes it easier to enable autocomplete for MAXScript in 3rd party editors.


Top 10 Analytics Trends for 2019

2019 is the year that analytics technology starts delivering what users have been dreaming about for over forty years — easy, natural access to reliable business information.

1. Machine learning everywhere. We’ve reached the third great wave of analytics, after semantic-layer business intelligence platforms in the 90s and data discovery in the 2000s. Augmented analytics platforms based on cloud technology and machine learning are breaking down the longest-standing barriers to analytics success. They bring insights to users rather than forcing users to unearth elusive trends, and provide more intuitive interfaces that make it easier to get the data people need to do their jobs.

2. Embedded analytics accelerates. The historical line between operational applications and analytics continues to blur. Thanks to advances in machine learning, prescriptive “intelligent applications” have become a reality. These data-driven, self-learning business processes improve automatically over time and as people use them.

3. Cloud analytics adoption skyrockets. Cloud brings agility and faster innovation to analytics. As business applications move to the cloud, and external data becomes more important, cloud analytics becomes a natural part of enterprise architectures.

The advantages are particularly important for smaller organizations: the cloud offers affordable, on-demand access to analytical and data processing power that was previously reserved for much larger organizations with dedicated analytics teams.

However, some data will never move to the cloud — a nuanced approach is required, leveraging existing analytics investments while moving to hybrid on-premise / cloud analytics architectures.

4. Better user experience drives greater adoption. Advances in speech and text recognition mean users can finally ask business questions using everyday language. AI-assisted data discovery can automatically mine data for insights and propose appropriate views of what’s new, exceptional, or different.

Chat bots and personal assistants provide seamless access to the basic numbers used to run the business. And using real-time systems as a foundation, managers finally get dashboards with all the information they need to run every aspect of the business, in real time, at their fingertips.

5. Compliance drives true data platform adoption, supported by more flexible data management. As it has been for the last forty years, data collection, preparation, and standardization remain the most challenging aspects of analytics. The rise of compliance and privacy concerns are driving the adoption of more standardized approaches — for example, reducing the attractiveness of data discovery architectures that extract and manipulate data separately from core systems.

Real-time processing, data catalogues and new “data orchestration” systems allow organizations to retain a coherent view of data across the organization without having to physically store it in a single place.

6. Data literacy will continue to be a big problem. The biggest barrier to analytics success has never been technology. Giving somebody the best pencil in the world will not make them Picasso.

Analytics culture, skills, and organization continue to be the biggest barriers to turning information into lower costs or increased profits. Organizations must invest as much time and money in analytic skills and incentives as they do in technology.

7. There will be an increasing number of AI fails. Like any powerful technology, AI brings new dangers. Algorithms are sociopaths: they have no knowledge of what they are doing. AI brings amazing opportunities for improved productivity and augmented human intelligence. But it magnifies any existing problems with data quality and data bias and poses unprecedented challenges to privacy and ethics.

Comprehensive governance and data transparency policies are essential. Sadly, things will probably get worse before they get better — organizations must implement ethics processes, councils and external advisors before high-profile disasters hit the headlines.

8. End-to-end decision-making platforms emerge. Analytics has traditionally only paid attention to a small part of the end-to-end “data journey.” Information and insight are useless unless something actually changes in the business.

More holistic, end-to-end approaches to analytics are emerging, not only because of the combination of operations with analytics, but also with more proactive and seamless approaches to converting user analysis into concrete actions. This depends fundamentally on human judgement, consensus, and creativity, but it must be supported by better integration between analytics, traditional business planning activities, and social collaboration platforms.

9. AI and machine learning makes analytics more human. More powerful augmented analytics will eliminate a lot of the work around collecting and processing data, and identifying areas for further investigation. But ultimately people are the most important “technology” required to turn data into business improvement.

In an era where basic decisions can be increasingly automated, more strategic choices rely on uniquely human skills such as creativity, understanding of context, and leadership.

10. New experience analytics. Understanding and optimizing the customer experience is the bedrock of successful digital transformation. Traditional analytics focused on structured data flowing from operational systems. Newer analytic platforms have blended more unstructured data such as text, images, and raw sensor readings into analytic workflows.

The next step is to expand analytics to both operational data and “experience data” — the unique, subjective experiences of individuals as they interact with products, brands, and internal business processes.

Looking forward to 2020 and beyond. With ever-more devices capturing more nuanced data, with technology capabilities accelerating, and powerful machine learning still in its infancy, analytics is poised for a new golden age.


Understanding Key Management Policy – Part 2

In the first part of this two-part series on Key Management, we saw how an increasing number of organizations are encrypting their sensitive data to mitigate cybersecurity risks. As covered earlier, with cybercriminals getting more sophisticated, merely encrypting data is not sufficient.

With data encryption, the risk is transferred from the data to the encryption keys. To ensure optimal data protection, organizations should make sure that their encryption keys are efficiently managed and safeguarded at each stage of their lifecycle.

In this part, we will cover the various benefits of centralizing your key management and guide you on how to adopt key management for your organization.

Centralized Key Management

When it comes to securely storing the encryption keys, three pertinent questions should be addressed:

1. Where are the keys stored – in third-party applications, in the cloud (private, public, or hybrid), or in a heterogeneous environment that supports multiple databases?

2. Are the keys protected with strong access management mechanisms that prevent unauthorised access?

3. Is your approach to key security compliant with the statutory mandates of the regulatory bodies?

As more and more data gets encrypted, the dependence on encryption keys increases and safeguarding all the keys (throughout their entire lifecycle) becomes challenging. The task becomes more daunting in an environment where organizations use diverse vendor systems that generate their own keys.

Further, encryption keys undergo many changes throughout their lifecycle – creation, versioning, distribution, rotation, storage, archival, backup, and ultimately destruction – so managing the keys at each of these junctures becomes critical.

This is where centralized key management comes in handy. With the inherent ability to safely store and manage all the encryption keys centrally in a secure and efficient manner, organizations can uniformly view, control, and administer the encryption keys for all their sensitive data – whether it resides in the cloud, in storage, in databases, or virtually anywhere else.
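As a conceptual sketch (not any vendor's API), the lifecycle stages above can be modeled as a toy in-memory key store with versioned rotation; in practice a KMS backs this with an HSM and speaks KMIP to other systems:

```python
import secrets

class KeyStore:
    """Toy centralized key store illustrating lifecycle stages:
    creation, versioning, rotation, archival, and destruction.
    Illustrative only -- not production key management."""

    def __init__(self):
        self._keys = {}  # name -> list of key versions, newest last

    def create(self, name, length=32):
        """Create version 1 of a named key."""
        self._keys[name] = [{"version": 1,
                             "material": secrets.token_bytes(length),
                             "state": "active"}]

    def rotate(self, name):
        """Archive the current version and activate a fresh one.
        Archived versions stay available to decrypt old ciphertext."""
        versions = self._keys[name]
        versions[-1]["state"] = "archived"
        versions.append({"version": len(versions) + 1,
                         "material": secrets.token_bytes(len(versions[-1]["material"])),
                         "state": "active"})

    def active(self, name):
        """Return the currently active version of a key."""
        return next(v for v in reversed(self._keys[name]) if v["state"] == "active")

    def destroy(self, name):
        """Forget all versions of a key (end of lifecycle)."""
        del self._keys[name]

store = KeyStore()
store.create("customer-db")
store.rotate("customer-db")
current = store.active("customer-db")  # version 2; version 1 remains archived
```

The point of the sketch is the single authoritative inventory: every consumer asks the store for the active version rather than holding its own copy, which is what makes uniform rotation and auditing possible.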

Leading Key Management Solutions (KMSs) can seamlessly manage keys across heterogeneous encryption platforms. Because they offer extensive support for the Key Management Interoperability Protocol (KMIP) standard, as well as for proprietary interfaces, managing a disparate set of encryption keys becomes easier.

Apart from secure storage and management, another important aspect of centralized key management is key governance. Merely storing and managing the keys is not sufficient; ensuring foolproof access management is equally important. Centralized key management enables proper key governance – even when the data and people move from department to department within the organization.

Requisites for Effective Centralized Key Management

Now that we understand why organizations should adopt centralized key management to ensure optimal data protection, let’s look at the three important requisites for centralized key management to work smoothly:

1. Key Management Server

At the heart of any good Key Management Solution is a FIPS 140-2 Level 3-certified, intrusion-resistant, tamper-proof hardware server (also known as a Hardware Security Module, or HSM) that plays the important role of creating, storing, retrieving, rotating, archiving, and deleting the encryption keys.

This server also facilitates seamless communication with all other applications (both internal as well as external) through native encryption using the Key Management Interoperability Protocol (KMIP).

Below are three important points that organizations should consider while selecting a key management server:

(1) Adherence to Regulatory Compliances

The server must comply with federal security requirements that mandate the destruction of all the stored encryption keys upon detection of a forced entry.

(2) Role Management

The server should have in-built role management features that provide separation of duties between various user roles with handy tools to quickly assign/delete roles. As more and more data gets encrypted leading to an increasing dependence on encryption keys, role management becomes a crucial feature for any organization.

(3) Interoperability

The server should be able to coherently interoperate with other business applications by providing access to its user interface through APIs, web services and encryption connectors.

As a best practice, organizations should:

(a) Store all encryption keys (and not just the Root of Trust Master Key) in the hardware server.

(b) Ensure that the autorotation and versioning of keys take place as per a pre-defined schedule without any downtime during the key rotation process, and

(c) Ensure that the whitelisting of the IP address happens within the secure hardware server itself.

2. Key Management Policies

As seen in our previous post, a key management policy (KMP) is a pre-defined set of rules that cover the goals, responsibilities, and overall requirements for securing and managing an organization’s encryption keys.

While a key management server can centrally manage all the encryption keys and enforce set policies, it cannot create a KMP on its own. The onus of chalking out a comprehensive KMP lies with the organization’s Cybersecurity & IT Heads, like the Chief Information Security Officer (CISO), Chief Risk Officer (CRO), etc. who are responsible for ensuring the adoption of KMPs for data protection. ‘Unambiguity’ is one of the most important pillars of a good KMP that makes sure that there are no misinterpretations whatsoever while accessing the encryption keys. For example, a KMP can unequivocally state that the employees of one business unit or department cannot access the encryption keys of another unit, or that access to the keys can be granted only through the corporate LAN.

3. Key Management Processes

Key management processes are the diverse inputs, activities, and outputs that are pivotal to centralized key management.

These processes help users apply their organization’s KMP and can be automated or performed manually. For example, depending on the sensitivity of the data to be accessed, the key management process may instruct users to connect either through a VPN or through the corporate LAN.
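Rules like these can be sketched as a small policy check; the sensitivity tiers and channel names below are hypothetical examples, not from any real KMP:

```python
def required_channel(data_sensitivity):
    """Map a data sensitivity tier to the access channel the
    key management process prescribes (hypothetical tiers)."""
    policy = {"public": "any", "internal": "corporate-lan", "restricted": "vpn"}
    return policy[data_sensitivity]

def authorize(user_department, key_department, channel, data_sensitivity):
    """Enforce two example KMP rules: no cross-department key access,
    and access only over the prescribed channel."""
    if user_department != key_department:
        return False  # one unit may not access another unit's keys
    needed = required_channel(data_sensitivity)
    return needed == "any" or channel == needed

# A finance user on the LAN may not touch HR's keys...
print(authorize("finance", "hr", "corporate-lan", "internal"))   # denied
# ...but may use finance keys over the prescribed channel.
print(authorize("finance", "finance", "vpn", "restricted"))      # allowed
```

Encoding the policy this way is what "unambiguity" buys: the same request always produces the same allow/deny decision, with no room for interpretation.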

Gemalto’s SafeNet KeySecure

As the global leader in enterprise key management, Gemalto’s SafeNet KeySecure is widely adopted by organizations across the globe to centrally manage their encryption keys.

Available as a hardware appliance or virtual security appliance, SafeNet KeySecure is a plug-and-play, secure centralized key management platform that can be quickly deployed in physical, virtualized infrastructure and public cloud environments.

Holistically supporting data encryption and key management of a diverse set of databases like Oracle, IBM DB2, Microsoft SQL, Mongo DB, etc., SafeNet KeySecure also seamlessly supports the generation, storage and exporting of keys in a Bring-Your-Own-Key (BYOK) environment from cloud players like Microsoft Azure, Amazon Web Services, etc.

Below is a quick snapshot of the diverse integrations ecosystem that Gemalto’s SafeNet KeySecure supports:

For organizations that have already invested in HSM devices, Gemalto offers a cost-friendly Virtual Key Management Solution – SafeNet Virtual KeySecure that centralizes all cryptographic processing and provides scalable key management at remote facilities or cloud infrastructures such as VMware or AWS Marketplace.

To Sum It Up

With rising incidents of cyber attacks and data breaches, neither front-line defense mechanisms nor mere data encryption suffice. To safeguard sensitive data, organizations should not only secure their encryption keys from unauthorized access, but also efficiently manage them centrally through a state-of-the-art, highly scalable key management solution. Learn more about Enterprise Key Management and how it can help your organization efficiently manage your encryption keys.


Understanding Key Management Policy – Part 1

With rising incidents of data breaches, organisations across the globe are realising that merely implementing perimeter defense systems no longer suffices to thwart cyber attacks.

While front line defense mechanisms like firewalls, anti-theft and anti-spyware tools definitely act as a strong deterrent against cyber attacks, they are rendered useless once a hacker exploits their vulnerabilities to bypass them and gain inside entry.

Alarmed by a spike in data breaches, many regulations like the Payment Card Industry Data Security Standard (PCI DSS), UIDAI’s Aadhaar circulars, RBI’s Gopal Krishna Committee Report and the upcoming Personal Data Protection Bill in India now urge organisations to encrypt their customers’ personal data.

This has resulted in an increasing number of organisations adopting data encryption as their last line of defense in the eventuality of a cyber attack. Unfortunately, with cybercriminals getting smarter and more sophisticated with every passing day, merely encrypting data is no longer the proverbial silver bullet to prevent data breaches.

In this two-part blog series, we will deep dive into the concept of (encryption) key management and cover the pivotal role a well-defined Key Management Policy (KMP) plays in data protection.

Let’s first begin with the basics!

Types of Encryption (Crypto) Keys

Crypto keys can be broadly categorised into two types – ‘symmetric keys’ and ‘asymmetric keys’.

In symmetric key encryption, the cryptographic algorithm uses a single (i.e. same) key for both encryption and decryption. Contrastingly, in asymmetric key encryption, the algorithm uses two different (but related) keys for encryption and decryption. These keys are known as ‘public keys’ and ‘private keys’.

While the public key is used for data encryption, the private key is used for data decryption. Since any data encrypted with the public key cannot be decrypted without using the corresponding private key, ensuring optimal security of the private keys is crucial for foolproof data protection.
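To make the symmetric case concrete, here is a deliberately toy Python sketch in which one shared key both encrypts and decrypts. Repeating-key XOR stands in for a real cipher such as AES; it is not secure and is for illustration only:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy symmetric cipher (repeating-key XOR). Illustration only --
    # real deployments use vetted algorithms such as AES.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = secrets.token_bytes(16)                  # the single shared secret
ciphertext = xor_cipher(b"PAN: 4111 1111", key)
recovered = xor_cipher(ciphertext, key)        # the SAME key decrypts
assert recovered == b"PAN: 4111 1111"
```

The asymmetric case needs a real library (e.g. a third-party package such as `cryptography` for RSA key pairs), so it is omitted here; the defining difference is simply that encryption and decryption would use two different but related keys.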

Key Management

Since crypto keys pass through multiple phases during their lifetime – generation, registration, distribution, rotation, archival, backup, revocation and destruction – securely managing these keys at each phase is very important.

Effective key management means protecting the crypto keys from loss, corruption and unauthorised access.
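The lifecycle phases above can be sketched as a simple state machine. The allowed transitions below are an illustrative assumption (a real KMP would define them precisely, e.g. following NIST SP 800-57), and backup is treated here as an operational property rather than a distinct state:

```python
# Sketch of the key lifecycle as a state machine. Transition rules are
# an assumption for illustration, not a normative specification.
ALLOWED = {
    "generated":   {"registered"},
    "registered":  {"distributed"},
    "distributed": {"rotated", "archived", "revoked"},
    "rotated":     {"distributed", "archived", "revoked"},
    "archived":    {"revoked", "destroyed"},
    "revoked":     {"destroyed"},
    "destroyed":   set(),          # terminal state
}

def transition(state: str, new_state: str) -> str:
    """Move a key to a new lifecycle state, rejecting illegal jumps."""
    if new_state not in ALLOWED.get(state, set()):
        raise ValueError(f"illegal transition: {state} -> {new_state}")
    return new_state

state = "generated"
for step in ("registered", "distributed", "archived", "destroyed"):
    state = transition(state, step)
assert state == "destroyed"
```

Rejecting illegal transitions in code is one way to enforce, rather than merely document, the policy that no phase of the lifecycle can be skipped.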

Challenges to Key Management

As more and more organisations generate thousands of crypto keys today for a diverse and disparate set of encryption-dependent systems spread across multiple businesses and geographical locations, key management becomes a big challenge.

To ensure that crypto keys do not fall in the wrong hands, a common practice followed by many organisations is to store these keys separately in FIPS-certified Hardware Security Modules (HSMs) that are in-built with stringent access controls and robust audit trail mechanisms.

However, with organisations using a diverse set of HSM devices like Payment HSMs for processing financial transactions, General Purpose HSMs for common cryptographic operations, etc., key management woes intensify. Further, merely storing the keys separately in HSM devices is not sufficient, as apart from secure storage, efficient management of the crypto keys at every phase of their lifecycle is very important.

Some of the other key management challenges that organisations face include using the correct methodologies to update system certificates and keys before they expire and dealing with proprietary issues when keeping a track of crypto updates on legacy systems.

Hence, cybersecurity experts recommend that organisations centralise the management of their crypto keys, consolidate their disparate HSM systems and chalk out a comprehensive KMP that provides clear guidelines for effective key management.

Key Management Policy (KMP)

While most organisations have comprehensive Information Security and Cybersecurity policies, very few have a documented Key Management Policy.

A well-defined KMP firmly establishes a set of rules that cover the goals, responsibilities, and overall requirements for securing and managing crypto keys at an organisational level.

Designed to cohesively cover each stage of a key’s lifecycle, a robust KMP should protect the key’s:

1. Confidentiality
2. Integrity
3. Availability, and
4. Source Authentication.

The KMP should also cover all the cryptographic mechanisms and protocols that can be utilised by the organisation’s key management system.

Last, but not least, a good KMP should remain consistent and must align with the organisation’s other macro-level policies. For example, if an organisation’s information security policy mandates that electronically transmitted information should be securely stored for a period of 7-10 years, the KMP should be able to easily align to such a mandate.
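That alignment requirement can be expressed as a simple automated consistency check. The field names below are invented, and the figures simply mirror the 7-10 year example above:

```python
# Hypothetical consistency check: the KMP's key-retention window must
# cover the retention mandated by the wider information security policy,
# since encrypted data is unreadable once its key is destroyed.
infosec_policy = {"data_retention_years": (7, 10)}   # min, max mandate
kmp = {"key_retention_years": 10}

def kmp_aligned(kmp: dict, infosec: dict) -> bool:
    _, max_required = infosec["data_retention_years"]
    # Keys must remain recoverable at least as long as the data they protect.
    return kmp["key_retention_years"] >= max_required

assert kmp_aligned(kmp, infosec_policy)
```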

To Sum It Up

Data encryption is no longer sufficient to prevent data breaches and merely storing the crypto keys separately no longer guarantees foolproof protection against sophisticated cyber attacks.

The need of the hour is to safeguard the keys at each phase of their lifecycle, manage them centrally and implement a robust KMP to ensure optimal data protection.

In the next part, we will discuss how organisations can leverage the Key Management Interoperability Protocol (KMIP) to manage their encryption keys and how Gemalto’s Key Management Platform can help them centrally streamline key management.

In the meantime, familiarize yourself with our Key Management Platform, and learn how security teams can uniformly view, control, and administer cryptographic policies and keys for all their sensitive data—whether it resides in the cloud, in storage, in databases, or virtually anywhere else.


McAfee Cloud Workload Security

As corporate data centers evolve, more workloads are migrated to cloud environments every day. Most organizations have a hybrid environment with a mixture of on-premises and cloud workloads, including containers, which are constantly in flux. This introduces a security challenge as cloud environments (private and public) require new approaches and tools for protection. Organizations need central visibility of all cloud workloads with complete defense against the risk of misconfiguration, malware, and data breaches.

McAfee® Cloud Workload Security (McAfee® CWS) automates the discovery and defense of elastic workloads and containers to eliminate blind spots, deliver advanced threat defense, and simplify multicloud management. McAfee provides protection that makes it possible for a single, automated policy to effectively secure your workloads as they transition through your virtual private, public, and multicloud environments, enabling operational excellence for your cybersecurity teams.

Modern Workload Security: Use Cases

Automated discovery

Unmanaged workload instances and Docker containers create gaps in security management and can give attackers the foothold they need to infiltrate your organization. McAfee CWS discovers elastic workload instances and Docker containers across Amazon Web Services (AWS), Microsoft Azure, OpenStack, and VMware environments. It also continuously monitors for new instances. You gain a centralized and complete view across environments and eliminate operational and security blind spots that lead to risk exposure.

Gaining insights into network traffic

By utilizing native network traffic provided from the cloud workloads, McAfee CWS is able to augment and apply intelligence from McAfee® Global Threat Intelligence (McAfee® GTI) data feeds. The enriched information is able to display properties such as risk score, geo-location, and other important network information. This information can be used to create automated remediation actions to protect workloads.
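The enrichment flow might look like the following sketch. The feed contents, field names, and the 0-100 risk scale are hypothetical and do not reflect McAfee GTI's actual schema:

```python
# Illustrative sketch: enrich workload flow records with a threat feed
# and flag high-risk traffic for automated remediation.
THREAT_FEED = {
    "203.0.113.9":  {"risk_score": 92, "geo": "unknown"},
    "198.51.100.4": {"risk_score": 12, "geo": "US"},
}

def enrich(flow: dict) -> dict:
    """Merge threat-intelligence properties into a raw flow record."""
    intel = THREAT_FEED.get(flow["remote_ip"], {"risk_score": 0, "geo": "n/a"})
    return {**flow, **intel}

def needs_remediation(flow: dict, threshold: int = 80) -> bool:
    """Trigger an automated action when enriched risk crosses a threshold."""
    return enrich(flow)["risk_score"] >= threshold

assert needs_remediation({"remote_ip": "203.0.113.9", "port": 443})
assert not needs_remediation({"remote_ip": "198.51.100.4", "port": 443})
```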

Integration into deployment frameworks

McAfee CWS creates deployment scripts to allow the automatic deployment and management of the McAfee® agent on cloud workloads. These scripts allow integration into tools such as Chef, Puppet, and other DevOps frameworks for deployment of the McAfee agent to workloads running on cloud providers such as AWS and Microsoft Azure.

Consolidate events

McAfee CWS allows organizations to use a single interface to manage numerous countermeasure technologies for both on-premises and cloud environments. This also includes integration into additional technologies, like AWS GuardDuty, McAfee® Policy Auditor, and McAfee® Network Security Platform.

  • Administrators can leverage the continuous monitoring and unauthorized behaviors identified by AWS GuardDuty, providing yet another level of threat visibility. This integration allows McAfee CWS customers to view GuardDuty events, which include network connections, port probes, and DNS requests for EC2 instances, directly within the McAfee CWS console.
  • McAfee Policy Auditor performs agent-based checks against known or user-defined configuration audits for compliance such as Health Insurance Portability and Accountability Act (HIPAA), Payment Card Industry Data Security Standard (PCI-DSS), Center for Internet Security Benchmark (CIS Benchmark), or other industry standards. McAfee CWS reports any failed audits for instant visibility into misconfiguration for workloads in the cloud.
  • McAfee Network Security Platform is another cloud security platform that performs network inspection for traffic in hybrid as well as AWS and Microsoft Azure environments. It performs deeper packet-level inspections against network traffic, and it reports any discrepancies or alerts through McAfee CWS. This provides single-pane visibility across multicloud environments for remediation.

Enforcement of network security group policies

McAfee CWS permits users and administrators to create baseline security group policies and audit the policies that are running on the workloads against these baselines. Any deviations or changes from the baseline can create an alert in the McAfee CWS console for remediation. Administrators also can manually configure native network security groups from McAfee CWS, which enables them to directly control cloud-native security group policies.
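The baseline audit described above boils down to a set difference between approved and observed rules, as in this simplified sketch. The (protocol, port, CIDR) rule format is an assumption for illustration:

```python
# Sketch of a baseline audit: compare the rules actually attached to a
# workload's security group against an approved baseline and report drift.
BASELINE = {
    ("tcp", 443, "0.0.0.0/0"),   # public HTTPS is approved
    ("tcp", 22, "10.0.0.0/8"),   # SSH only from the internal network
}

def audit(actual_rules: set) -> dict:
    return {
        "unexpected": actual_rules - BASELINE,  # rules added outside the baseline
        "missing":    BASELINE - actual_rules,  # required rules that were removed
    }

# A workload where SSH was accidentally opened to the world:
drift = audit({("tcp", 443, "0.0.0.0/0"), ("tcp", 22, "0.0.0.0/0")})
assert drift["unexpected"] == {("tcp", 22, "0.0.0.0/0")}
assert drift["missing"] == {("tcp", 22, "10.0.0.0/8")}
```

Each non-empty entry in the result would correspond to an alert raised in the console for remediation.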

What Sets McAfee Cloud Workload Security Apart: Key Features and Technologies

Cloud-native build support

Using McAfee CWS, customers can consolidate management of multiple public and private clouds in a single management console, including AWS EC2, Microsoft Azure Virtual Machines, OpenStack, and VMware vCenter. McAfee CWS also provides cloud-native build support for Amazon Elastic Container Service for Kubernetes (Amazon EKS) and Microsoft Azure Kubernetes Service (AKS), allowing customers to import and run those workloads in the cloud.

Simple, centralized management

A single console provides consistent security policy and centralized management in multicloud environments across servers, virtual servers, and cloud workloads. Administrators can also create multiple role-based permissions in McAfee® ePolicy Orchestrator® (McAfee ePO™) software, enabling them to define user roles more specifically and appropriately.

Network visualization with microsegmentation

Cloud-native network visualization, prioritized risk alerting, and micro-segmentation capabilities deliver awareness and control to prevent lateral attack progression within virtualized environments and from external malicious sources. Single-click shutdown or quarantine capability helps alleviate the potential for configuration errors and increases the efficiency of remediation.

Superior virtualization security

The McAfee CWS suite protects your private cloud virtual machines from malware using McAfee® Management for Optimized Virtual Environments AntiVirus (McAfee® MOVE AntiVirus), and it does this without straining underlying resources or adding operating costs. McAfee MOVE AntiVirus allows organizations to offload security to dedicated virtual machines for optimized scanning of their virtualized environment.

Users gain anti-malware protection via McAfee® Endpoint Security for Servers. This solution can intelligently schedule resource-intensive tasks, such as on-demand scanning, to avoid impact to critical business processes.

Tag and automate workload security

Assign the right policies to all workloads automatically with the ability to import AWS and Microsoft Azure tag information into McAfee ePO software and assign policies based on those tags. Existing AWS and Microsoft Azure tags synchronize with McAfee ePO software tags so they’re automatically managed.
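Tag-driven policy assignment reduces to a lookup from synchronized tags to policies. The tag keys and policy names below are invented for illustration, not McAfee ePO's actual configuration:

```python
# Hypothetical sketch: pick a security policy from synchronized cloud tags.
TAG_POLICY_MAP = {
    ("env", "production"): "strict-production-policy",
    ("env", "staging"):    "standard-policy",
}
# Fail safe: instances with no recognized tag get a restrictive default.
DEFAULT_POLICY = "quarantine-until-classified"

def assign_policy(instance_tags: dict) -> str:
    for (key, value), policy in TAG_POLICY_MAP.items():
        if instance_tags.get(key) == value:
            return policy
    return DEFAULT_POLICY

assert assign_policy({"env": "production", "team": "payments"}) == "strict-production-policy"
assert assign_policy({"team": "dev"}) == "quarantine-until-classified"
```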

Auto-remediation

Users define security policies in McAfee ePO software. If McAfee CWS finds a system that is not protected by those policies and that is found to contain malware, the system is automatically quarantined.
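The decision rule just described is small enough to sketch directly. The field names are illustrative, not McAfee ePO's actual data model:

```python
# Sketch of the auto-remediation rule: a workload that is both unmanaged
# (no ePO policy applied) and infected gets quarantined automatically.
def remediation_action(workload: dict) -> str:
    if not workload["epo_policy_applied"] and workload["malware_detected"]:
        return "quarantine"
    return "none"

assert remediation_action({"epo_policy_applied": False, "malware_detected": True}) == "quarantine"
assert remediation_action({"epo_policy_applied": True, "malware_detected": True}) == "none"
```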

Adaptive threat protection

McAfee CWS integrates comprehensive countermeasures, including machine learning, application containment, virtual machine-optimized anti-malware, whitelisting, file integrity monitoring, and micro-segmentation, that protect your workloads from threats like ransomware and targeted attacks. McAfee® Advanced Threat Protection defeats sophisticated attacks that have never been encountered before by applying machine learning techniques to convict malicious payloads based on their code attributes and behavior.

Application control

Application whitelisting prevents both known and unknown attacks by allowing only trusted applications to run while blocking any unauthorized payloads. McAfee® Application Control provides dynamic protection based on local and global threat intelligence, as well as the ability to keep systems up to date, without disabling security features.

File integrity monitoring (FIM)

McAfee® File Integrity Monitoring continuously monitors to ensure your system files and directories have not been compromised by malware, hackers, or malicious insiders. Comprehensive audit details provide information about how files on server workloads are changing and alert you to the presence of an active attack.
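The core FIM idea, snapshotting file hashes and diffing them later, can be sketched in a few lines of Python. A real FIM product layers audit detail, real-time hooks, and alerting on top of this:

```python
import hashlib
import tempfile
from pathlib import Path

def snapshot(paths):
    """Record a SHA-256 baseline hash for each monitored file."""
    return {p: hashlib.sha256(Path(p).read_bytes()).hexdigest() for p in paths}

def changed_files(baseline, paths):
    """Re-hash and report every file that no longer matches the baseline."""
    current = snapshot(paths)
    return [p for p in paths if current[p] != baseline[p]]

with tempfile.TemporaryDirectory() as d:
    conf = Path(d) / "app.conf"
    conf.write_text("port=443\n")
    baseline = snapshot([conf])
    conf.write_text("port=8443\n")    # simulated unauthorized change
    assert changed_files(baseline, [conf]) == [conf]
```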


McAfee CWS ensures that you maintain the highest quality of security while taking advantage of the cloud. It covers multiple protection technologies, simplifies security management, and prevents cyberthreats from impacting your business—so you can focus on growing it. Below is a feature comparison of the available package options.


Five Ways to Rethink Your Endpoint Protection Strategy

Device security is no longer about traditional antivirus versus next-generation endpoint protection. The truth is you need a layered and integrated defense that protects your entire digital terrain and all types of devices—traditional and nontraditional. ESG Senior Principal Analyst Jon Oltsik frames it this way: “… endpoint security should no longer be defined as antivirus software. No disrespect to tried-and-true AV, but endpoint security now spans a continuum that includes advanced prevention technologies, endpoint security controls, and advanced detection/response tools.”

In today’s survival-of-the-fittest landscape, here are five ways to not just survive, but thrive:

1. More tools do not make for a better defense.

Scrambling to adapt to the evolving landscape, many security teams have resorted to bolting on the latest “best-of-breed” point solutions. While each solution may bring a new capability to the table, it’s important to look at your overall ecosystem and how these different defenses work together.

There are serious shortfalls in deploying disparate, multivendor endpoint security technologies that don’t collaborate with each other. Because point solutions have limited visibility and see only what they can see, the burden of connecting the dots falls on you. Adversaries are quick to take advantage of the windows of opportunity these manual processes create, evading defenses or slipping through the cracks unnoticed.

2. It’s not about any one type of countermeasure.

As a never-ending array of “next-generation” solutions started to emerge and flood the marketplace, you were likely told more than once that antivirus isn’t enough and what you need to do is switch to next-gen. In reality, it’s not about achieving a next-generation approach or finding the best use for antivirus. It’s really about implementing a holistic device security strategy that connects and coordinates an array of defenses. This includes signature-based defense (which eliminates 50% of the attack noise, allowing algorithmic approaches to run more aggressively with fewer false alarms), plus exploit protection, reputations, machine learning, ongoing behavioral analytics, and roll-back remediation to reverse the effects of ransomware and other threats.
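The layered idea can be sketched as a pipeline where cheap signature checks run first and filter out known threats, so the costlier analytic layers see less noise. The layer names and verdict logic here are invented for illustration:

```python
# Illustrative sketch of layered countermeasures as an ordered pipeline.
KNOWN_BAD_HASHES = {"deadbeef"}   # stand-in for a signature database

def signature_layer(sample):
    return "block" if sample["sha"] in KNOWN_BAD_HASHES else "pass"

def ml_layer(sample):
    # Stand-in for a machine-learning score on code attributes/behavior.
    return "block" if sample.get("ml_score", 0) > 0.9 else "pass"

def verdict(sample):
    # Cheapest layer first; anything it blocks never reaches the ML layer.
    for layer in (signature_layer, ml_layer):
        if layer(sample) == "block":
            return "block"
    return "allow"

assert verdict({"sha": "deadbeef"}) == "block"                 # signatures
assert verdict({"sha": "cafe", "ml_score": 0.95}) == "block"   # ML catches it
assert verdict({"sha": "cafe", "ml_score": 0.1}) == "allow"
```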

Each device type has its own security needs and capabilities. You need to be able to augment built-in device security with the right combination of advanced protection technologies. The key to being resilient is to deliver inclusive, intelligently layered countermeasures, and antivirus is a tool that has its place, with benefits and limitations, just like every other countermeasure in this unified, layered approach to device security.

3. All devices are not created equal.

Today, “endpoint” has taken on a whole new meaning. The term now encompasses traditional servers, PCs, laptops, mobile devices (both BYOD and corporate-issued), cloud environments, and IoT devices like printers, scanners, point-of-sale handhelds, and even wearables.

Adversaries don’t just target one type of device—they launch organized campaigns across your entire environment to establish a foothold and then move laterally. It’s important to harness the defenses built into modern devices while extending their overall posture with advanced capabilities. Some endpoints, like Internet of Things (IoT) devices, lack built-in protection and will need a full-stack defense. Ultimately, the goal is to not duplicate anything and not leave anything exposed.

4. All you need is a single management console.

If you’ve been deploying bolted-on endpoint security technologies or several new, next-generation solutions, you may be seeing that each solution typically comes with its own management console. Learning and juggling multiple consoles can overtax your already stretched-thin security team and make them less effective, as they are unable to see your entire environment and the security posture of all your devices in one place. But it doesn’t have to be this way. Practitioners can more quickly glean the insights they need to act when they can view all the policies, alerts, and raw data from a centralized, single-pane-of-glass console.

5. Mobile devices are among the most vulnerable.

Mobile devices are an easy target for attackers and provide a doorway to corporate networks. We’re seeing more app-based attacks, targeted network-based attacks, and direct device attacks that take advantage of low-level footholds. For this reason, it’s essential to include mobile devices in your security strategy and protect them as you would any other endpoint.