With hybrid work becoming the norm, the mapping technology used to build and manage workplace digital twins could also lower the barriers for startups entering the market. New businesses that would otherwise need to invest in corporate real estate can achieve virtual flexibility at a lower cost. Because real-time mapping affords visualization of indoor assets, managers of airports or hospitals, for instance, can view multiple floors, entrances, stairwells and rooms to watch what's happening and where. We will likely see crossover in how this in-the-moment tracking of equipment and resources plays out in the metaverse and in the real world. ... While the metaverse will likely represent an avenue of escape and entertainment for many, it also has the potential to be a valuable business tool capable of offering real-world simulations. One consultant has already been doing this at a scale large enough to mimic the effects of global warming and show how it will disrupt businesses and entire cities. Experiencing one's own replicated neighborhood amid rising seas, encroaching storms and more offers a visceral, relatable experience more likely to motivate action.
At a high level, Infrastructure-as-Data tools like VMware’s Idem and Ansible, and Infrastructure-as-Code, dominated by Terraform, were created to help DevOps teams simplify and automate application deployments across multicloud and other environments, while reducing manual configuration and processes. ... When cloud architectures need to be expressed using code, “you’re just writing more and more and more and more Terraform,” he said. “Idem is different from how you generally think of Infrastructure as Code — everything boils down to these predictable datasets.” “Instead of sitting down and saying, ‘I’m going to write out a cloud in Terraform,’ you can point Idem towards your cloud, and it will automatically generate all of the data and all of the code and the runtimes to enforce it in its current state.” At the same time, Idem, as well as Ansible to a certain extent, was designed to make cloud provisioning more automated and simpler to manage.
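The “predictable datasets” idea can be illustrated with a minimal sketch. This is hypothetical code, not Idem’s actual API: desired infrastructure is expressed as plain data, and a generic reconciler diffs it against observed state to produce the actions needed to enforce it.

```python
# Illustrative sketch of Infrastructure-as-Data (hypothetical, not Idem's
# real API): desired state is a plain dataset, and a generic reconciler
# computes the create/update/delete actions needed to enforce it.

desired = {
    "vm/web-1": {"size": "m5.large", "region": "us-east-1"},
    "bucket/logs": {"versioning": True},
}

observed = {
    "vm/web-1": {"size": "m5.large", "region": "us-east-1"},
    "bucket/logs": {"versioning": False},
    "vm/stale-9": {"size": "t3.micro", "region": "us-east-1"},
}

def plan(desired, observed):
    """Diff two state datasets into create/update/delete actions."""
    actions = []
    for key, spec in desired.items():
        if key not in observed:
            actions.append(("create", key, spec))
        elif observed[key] != spec:
            actions.append(("update", key, spec))
    for key in observed:
        if key not in desired:
            actions.append(("delete", key, None))
    return actions

for action in plan(desired, observed):
    print(action)
```

Because both sides are data, the same diffing logic works for any resource type; pointing a tool at an existing cloud then amounts to exporting `observed` as the initial `desired` dataset.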
It is necessary to understand operating system and network principles at all levels: File storage, access management, log file policies, security policies, protocols used to share information between computers, et cetera. The core concepts, components and conventions associated with cyberdefense and cybersecurity should be identified, and a strong knowledge of industry best practices and frameworks is mandatory. Another core tenet is understanding how defensive approaches and technologies align to at least one of the five cyber defense phases: Identify, protect, detect, respond and recover. Key concepts to know here are identity and access management and control, network segmentation, cryptography use cases, firewalls, endpoint detection and response, signature- and behavior-based detection, threat hunting and incident response, and red and purple teams. One should also develop a business continuity plan, disaster recovery plan and incident response plan. ... This part is all about understanding the roles and responsibilities of everyone involved: Reverse engineers, security operations center analysts, security architects, IT support and helpdesk members, red/blue/purple teams, chief privacy officers and more.
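The alignment of controls to the five phases can be sketched as a simple lookup table. This is one possible grouping of the controls mentioned above (the entries under "identify" are added examples, not from the excerpt), not an authoritative taxonomy:

```python
# One possible grouping of defensive controls under the five cyber
# defense phases; the "identify" entries are illustrative additions.
PHASES = {
    "identify": ["asset inventory", "risk assessment"],
    "protect": ["identity and access management", "network segmentation",
                "cryptography", "firewalls"],
    "detect": ["endpoint detection and response",
               "signature- and behavior-based detection", "threat hunting"],
    "respond": ["incident response plan"],
    "recover": ["disaster recovery plan", "business continuity plan"],
}

def phase_of(control):
    """Return the defense phase a given control aligns to, if any."""
    for phase, controls in PHASES.items():
        if control in controls:
            return phase
    return None

print(phase_of("firewalls"))  # -> protect
```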
Teams Toolkit for Visual Studio, Visual Studio Code, and the command-line interface (CLI) are tools for building Teams and Microsoft 365 apps, fast. Whether you’re new to the Teams platform or a seasoned developer, Teams Toolkit is the best way to create, build, debug, test, and deploy apps. Today we are excited to announce that Teams Toolkit for Visual Studio Code and the CLI is now generally available (GA). Developers can start with scenario-based code scaffolds for notification and command-and-response bots, automate upgrades to the latest Teams SDK version, and debug apps directly in Outlook and Office. ... The Microsoft 365 App Compliance Program is designed to evaluate and showcase the trustworthiness of applications against industry standards, such as SOC 2, PCI DSS, and ISO 27001, for security, privacy, and data handling practices. We are announcing the preview of the App Compliance Automation Tool for Microsoft 365, which helps developers of applications built on Azure accelerate the compliance journey of their apps.
In the modern IT landscape, service development has moved toward an API-first and spec-first approach. IT environments are also becoming increasingly distributed. After all, organizations are no longer on-premises or even cloud-only, but working with hybrid cloud and multicloud environments. And their teams are physically distributed, too. Therefore, points of integration must be able to span various types of environments. The move toward microservices is fundamentally at odds with the traditional, monolithic ESB. By breaking down the ESB monolith into multiple focused services, you can retain many of the ESB’s advantages while increasing flexibility and agility. ... As API standards have matured, the API gateway can be leaner than an ESB, focused specifically on cross-cutting concerns. Additionally, the API gateway is focused primarily on client-service communication, rather than on all service-to-service communication. This specificity of scope allows API gateways to avoid scope creep, keeping them from becoming yet another monolith that needs to be broken down. When selecting an API gateway, it is important to find a product with a clear identity rather than an extensive feature set.
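The “clear identity” point can be made concrete with a minimal sketch, hypothetical and not modeled on any specific product: a gateway that handles only cross-cutting concerns (here, authentication and routing) while all business logic stays in the backing services.

```python
# Minimal sketch of an API gateway limited to cross-cutting concerns
# (hypothetical, not any specific product): it authenticates and routes,
# but contains no business logic of its own.
from typing import Callable, Dict

class Gateway:
    def __init__(self):
        self.routes: Dict[str, Callable[[dict], dict]] = {}
        self.api_keys = {"key-123": "orders-client"}  # illustrative store

    def register(self, path: str, handler: Callable[[dict], dict]) -> None:
        self.routes[path] = handler

    def handle(self, path: str, request: dict) -> dict:
        # Cross-cutting concern: authentication at the edge.
        if request.get("api_key") not in self.api_keys:
            return {"status": 401, "body": "unauthorized"}
        handler = self.routes.get(path)
        if handler is None:
            return {"status": 404, "body": "no such route"}
        # Routing only -- business logic stays in the backing service.
        return handler(request)

gw = Gateway()
gw.register("/orders", lambda req: {"status": 200, "body": ["order-1"]})
print(gw.handle("/orders", {"api_key": "key-123"}))
# -> {'status': 200, 'body': ['order-1']}
```

Anything beyond the edge concerns (validation of domain rules, data transformation for a particular service) deliberately has no home in this class; that constraint is what keeps a gateway from growing into another monolith.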
Inventions generated by AI challenge the patent system in a new way because the issue is about ‘who’ did the inventing, rather than ‘what’ was invented. The first and most pressing question that patent registration offices have faced with such inventions has been whether the inventor has to be human. If not, one fear is that AIs might soon be so prolific that their inventions could overwhelm the patent system with applications. Another challenge is even more fundamental. An ‘inventive step’ occurs when an invention is deemed ‘non-obvious’ to a ‘person skilled in the art’. This notional person has the average level of skill and general knowledge of an ordinary expert in the relevant technical field. If a patent examiner concludes that the invention would not have been obvious to this hypothetical person, the invention is a step closer to being patented. But if AIs become more knowledgeable and skilled than all people in a field, it is unclear how a human patent examiner could assess whether an AI’s invention was obvious. An AI system built to review all information published about an area of technology before it invents would possess a much larger body of knowledge than any human could.
The SIM card has a lot going for it. SIM cards use the same highly secure, cryptographic microchip technology that is built into every credit card. It's difficult to clone or tamper with, and there is a SIM card in every mobile phone – so every one of your users already has this hardware in their pocket. The pairing of the mobile phone number with its associated SIM card identity (the IMSI) is difficult to phish because it's a silent authentication check. The user experience is superior too. Mobile networks routinely perform silent checks that a user's SIM card matches their phone number in order to let them send messages, make calls, and use data – ensuring real-time authentication without requiring a login. Until recently, it wasn't possible for businesses to program the authentication infrastructure of a mobile network into an app as easily as any other code. tru.ID makes network authentication available to everyone. ... Moreover, with no extra input from the user, there's no attack vector for malicious actors: SIM-based authentication is invisible, so there are no credentials or codes to steal, intercept or misuse.
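The silent-check flow can be sketched conceptually. All names here are illustrative assumptions, not tru.ID's actual API: the server registers a pending check for a claimed phone number, the device fetches it over the mobile data connection, and the carrier-side lookup confirms the SIM behind that connection matches the claimed number.

```python
# Conceptual sketch of a SIM-based "silent" check (illustrative names
# only, not tru.ID's real API). The key property: the request travels
# over mobile data, so the carrier already knows which SIM sent it.
import secrets

PENDING = {}  # check_id -> claimed phone number

def create_check(phone_number: str) -> str:
    """Server side: register a pending check and return its one-time id."""
    check_id = secrets.token_hex(8)
    PENDING[check_id] = phone_number
    return check_id

def carrier_lookup(check_id: str, sim_msisdn: str) -> bool:
    """Carrier side: compare the SIM that made the request (known from
    the mobile data connection) against the claimed phone number."""
    return PENDING.pop(check_id, None) == sim_msisdn

check_id = create_check("+15551230000")
# The network observed the request coming from the SIM for +15551230000:
print(carrier_lookup(check_id, "+15551230000"))  # True
```

Because the check id is single-use and the user never types anything, there is nothing for a phishing page to capture; the "credential" is possession of the SIM itself.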
The realization that current data architectures can no longer support the needs of modern businesses is driving the need for new data engines designed from scratch to keep up with metadata growth. But as developers begin to look under the hood of the data engine, they face the challenge of enabling greater scale without the usual trade-off: compromised storage performance, agility and cost-effectiveness. This calls for a new architecture to underpin a new generation of data engines that can effectively handle the tsunami of metadata while still ensuring that applications have fast access to it. Next-generation data engines could be a key enabler of emerging use cases characterized by data-intensive workloads that require unprecedented levels of scale and performance. For example, implementing an appropriate data infrastructure to store and manage IoT data is critical for the success of smart city initiatives. This infrastructure must be scalable enough to handle the ever-increasing influx of metadata coming from traffic management, security, smart lighting, waste management and many other systems without sacrificing performance.
“As GDPR races to retrofit new legislative ‘add-ons’ that most technology companies will have evolved well beyond by the time they’re implemented, GDPR is barely an afterthought for marketing professionals who are readying themselves for a much more seismic change this year: the crumbling of third-party cookies,” he explained. “Because of that, advertisers will require new, privacy-respecting, non-tracking-based approaches to reach their target audiences. Now, then, is the time for businesses to establish what a value exchange between users and an ad-funded, free internet actually looks like – but that goes far beyond the remit of GDPR.” To increase focus on privacy in commercial settings, McDermott believes that major stakeholders such as Google need to “lead the charge” and collaborate when it comes to establishing a best practice on data capture. “For the smaller businesses,” he added, “it’ll be about forming an allegiance with bigger technology companies who have the resources to navigate these changes so they can chart a course together.”
Organizations increasingly suffer from a lack of visibility, drown in threat intelligence overload, and are hampered by inadequate tools. As a result, they struggle to discover, classify, prioritize, and manage internet-facing assets, which leaves them vulnerable to attack and incapable of defending their organization proactively. As attack surfaces expand, organizations can’t afford to limit their efforts to identification, discovery, and monitoring. They must improve their security management by adding continuous testing and validation. More can and should be done to make EASM solutions more effective and reduce the number of tools teams need to manage. Solutions must also blend legacy EASM with vulnerability management and threat intelligence. This more comprehensive approach addresses business and IT risk from a single solution. When vendors integrate threat intelligence and vulnerability management into an EASM solution, and additionally enable lines of business within the organization to assign risk scores based on business value, the value increases exponentially.
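Business-value-weighted risk scoring can be sketched simply. This is a hypothetical illustration, not any vendor's actual model: a CVSS-like severity (0–10) is scaled by a business-value weight (1–5) assigned by the owning line of business.

```python
# Hypothetical sketch of business-value-weighted risk scoring for
# internet-facing assets: severity (0-10, CVSS-like) scaled by a
# business-value weight (1-5) set by the owning line of business.
def risk_score(severity: float, business_value: int) -> float:
    """Scale a 0-10 severity by a 1-5 business-value weight."""
    return round(severity * business_value / 5, 1)

assets = [
    {"name": "payments-api", "severity": 7.5, "business_value": 5},
    {"name": "dev-wiki", "severity": 9.8, "business_value": 1},
]
for asset in assets:
    asset["risk"] = risk_score(asset["severity"], asset["business_value"])

# A high-value asset can outrank one with a higher raw severity.
print(sorted(assets, key=lambda a: a["risk"], reverse=True)[0]["name"])
```

Here the lower-severity but business-critical payments API ranks above the critical-severity internal wiki, which is the kind of prioritization the integrated approach aims to support.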
Quote for the day:
"The greatest good you can do for another is not just share your riches, but reveal to them their own." -- Benjamin Disraeli