The benefits of keeping open source open far outweigh the drawbacks. Allowing developers from all backgrounds and training practices to review and modify code means that it is constantly being improved, in turn allowing the entire industry to benefit from innovation, better security and healthy competition. It also gives developers greater mobility. Not only does open source software mean that developers are free to train and practice in any coding language they please, but these non-proprietary languages also become increasingly popular and in demand, granting developers flexibility in their work and careers. Open source platforms, specifically those that remain true to their roots, are so valuable that they can attract hordes of venture capital funding, even if there are no immediate prospects for monetary returns. Jocelyn Goldfein, a partner at venture capital fund Zetta, pointed out: “There’s probably at least two dozen venture firms that invest a lot in open source now.” Nowadays, the default question isn’t why a platform would be open source, but rather why it wouldn’t be.
How edge computing makes 5G essential
With 5G, it's really about decreasing latency and increasing bandwidth, and it's being driven more by enterprise applications than by consumer ones, although you see the consumer influence with people sitting on subways watching movies, playing video games, or even teleconferencing on their way to or from work. So think about the edge applications where data latency matters: eMedicine, or any of the mission-critical things that come with smart cities. You certainly don't want the power to go out while somebody is crossing the street and your smart sensors go down. That need for the infrastructure out at the edge to be resilient and robust is a critical aspect of what's going to happen with 5G. As we start into that transformation, Vertiv, as a company, is really focused on how we can support the critical infrastructure at the edge to ensure that capabilities are always on through battery backup, and that you don't have thermal runaway in hot locations like Phoenix, or failures in very cold locations like Alaska, or wherever that might happen to be.
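The resilience requirement described here can be made concrete with a small sketch. The following is a minimal, illustrative health check for an edge site, assuming hypothetical telemetry fields and thresholds; it is not any vendor's actual monitoring API.

```python
# Minimal sketch of an edge-site health check, illustrating the kind of
# "always on" monitoring described above. The telemetry fields and
# thresholds are hypothetical, not any vendor's actual API.

def check_site(telemetry: dict) -> list[str]:
    """Return a list of alerts for a single edge site."""
    alerts = []
    if telemetry["battery_charge_pct"] < 20:
        alerts.append("battery low: failover capacity at risk")
    if telemetry["intake_temp_c"] > 40:
        alerts.append("intake temperature high: thermal runaway risk")
    if telemetry["intake_temp_c"] < -25:
        alerts.append("intake temperature low: equipment below rated range")
    return alerts

# Example: a site in a hot location, like the Phoenix case mentioned above.
site = {"battery_charge_pct": 85, "intake_temp_c": 46}
for alert in check_site(site):
    print(alert)
```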
How Far Are We From Achieving Artificial General Intelligence?
Artificial General Intelligence (AGI) can be defined as the ability of a machine to perform any task that a human can. Although the aforementioned applications highlight the ability of AI to perform tasks with greater efficacy than humans, they are not generally intelligent; that is, they are exceedingly good at a single function while having zero capability to do anything else. Thus, while an AI application may be as effective as a hundred trained humans at one task, it can lose to a five-year-old child at any other. For instance, computer vision systems, although adept at making sense of visual information, cannot translate and apply that ability to other tasks. In contrast, a human, although sometimes less proficient at performing these functions, can perform a broader range of functions than any existing AI application. While an AI has to be trained for any function it needs to perform, with massive volumes of training data, humans can learn from significantly fewer experiences. Additionally, humans -- and agents with artificial general intelligence -- can generalize better, applying the learnings from one experience to other, similar experiences.
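The gap between narrow skill and general intelligence can be illustrated with a deliberately tiny example. The sketch below (toy tasks invented for illustration) fits a linear model on one conversion task, then queries it on an unrelated one; the model answers confidently and wrongly, because nothing it learned transfers.

```python
import numpy as np

# Toy illustration of narrow competence: fit a linear model on one task
# (Celsius -> Fahrenheit), then ask it about a different task (miles -> km).
c = np.array([0.0, 10.0, 20.0, 30.0, 40.0])
f = c * 9 / 5 + 32                      # training data for task A

slope, intercept = np.polyfit(c, f, 1)  # "trained" model: f = 1.8c + 32

# Task A: in-distribution, near-perfect.
print(slope * 25 + intercept)           # ~77.0 (25 C is indeed 77 F)

# Task B: the same model applied to miles -> km gives the same answer,
# because it has no notion that the task has changed.
print(slope * 25 + intercept)           # still ~77.0, but 25 miles is ~40.2 km
```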
Tomorrow's Cybersecurity Analyst Is Not Who You Think
First, cybercriminals are becoming much better at penetrating organizations using nontechnical means. With social engineering and phishing techniques, they can bypass organizations' increasingly advanced defenses by manipulating insiders to gain access. Research shows that phishing and social engineering were the most common methods of compromise in 2018, serving as the conduit to the initial point of entry in more than 60% of security breaches in both cloud and point-of-sale environments, as well as in 46% of corporate and internal network breaches. Second, the volume of data in organizations is growing exponentially and is increasingly stored in a decentralized manner, making it difficult to ensure it's being optimally protected. Research firm IDC predicts the volume of data worldwide will grow tenfold by 2025, to 163 zettabytes, with the majority created and managed by enterprises. This growth is being driven by the proliferation of artificial intelligence, the Internet of Things, and other machine-to-machine technologies in enterprises across all industries.
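For a sense of what that forecast implies, a quick back-of-the-envelope calculation: tenfold growth to 163 zettabytes suggests a baseline of roughly 16.3 ZB (an assumption inferred from the tenfold figure, consistent with IDC's 2016 estimate), which works out to about 29% compound annual growth over nine years.

```python
# Back-of-the-envelope: what annual growth rate turns the data volume
# tenfold over nine years (2016 -> 2025)? The baseline is inferred from
# the tenfold / 163 ZB figures cited above.
start_zb, end_zb, years = 16.3, 163.0, 9
cagr = (end_zb / start_zb) ** (1 / years) - 1
print(f"implied compound annual growth: {cagr:.1%}")  # ~29.2%
```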
Trainline On Track: Innovating And Navigating Change
Talent remains a key challenge for businesses – particularly when the knowledge and expertise required for technical roles is so vast. As Director of Engineering, Midgley is tasked with building and leading a team of 300 tech travel specialists. “The thing that I’m most passionate about is engineering culture. We really seek out the very best talent in each of our locations, and it’s always hard. The flip side is that, once these amazing people are through the door, you have to retain them and make them feel rewarded, satisfied, and highly motivated,” he says. To some extent, retaining talent is about giving people room to grow. This might include sending employees on study days and courses, or encouraging them to take part in enriching or educational activities. At the same time, this brings new learnings, approaches, and technologies into the organisation. In terms of working structures, Trainline has adopted the ‘two-pizza rule’ favoured by Amazon, in which employees form small working groups that could, as the name suggests, be fed by two pizzas. As well as leveraging internal employees as a source of innovation and ideas, the company collaborates with hundreds of international partners.
CAD and PLM: transforming the industrial landscape and shaping how humans work
“When we marry digital with human — which is what AR is all about — we get vastly more productive workers, thanks to the ability of the digital world to monitor, control, and optimise the world of humans,” continued Heppelmann. AR is about virtualising physical infrastructure in the industrial world, with real-time information for safety and productivity. “It should act as a digital mentor.” ... Combining technological innovation with the skills of workers expands human possibilities, and it must be part of business strategy. Humans do not have the capability to sift through the vast amounts of data that factories, or indeed any businesses, create. They need a way to look at the whole environment and know where to run analytics to optimise productivity. ... IoT can gather and analyse data from physical machines, helping to mitigate the problem of unplanned downtime, which drives down equipment availability in a factory setting, for example. Machine learning can then be used to predict when the problem will next occur, while AR and VR can be used by humans to view the problem up close in a digital environment, on top of the obvious safety and training benefits.
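As a concrete illustration of that IoT-plus-machine-learning loop, here is a minimal predictive-maintenance sketch: learn a baseline from healthy sensor readings, then flag readings that drift far from it so an inspection can be scheduled before unplanned downtime. The sensor values and threshold are invented for illustration.

```python
import numpy as np

# Minimal predictive-maintenance sketch: learn a baseline from healthy
# vibration readings, then flag readings that drift beyond 3 standard
# deviations. Real systems use richer models; the data here is invented.
rng = np.random.default_rng(0)
healthy = rng.normal(loc=1.0, scale=0.05, size=500)   # mm/s vibration
mean, std = healthy.mean(), healthy.std()

def is_anomalous(reading: float, k: float = 3.0) -> bool:
    return abs(reading - mean) > k * std

for reading in [1.02, 1.07, 1.31]:                    # live samples
    status = "ALERT: schedule inspection" if is_anomalous(reading) else "ok"
    print(f"{reading:.2f} mm/s -> {status}")
```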
Cisco launches a developer-community cert program
Perhaps one of the biggest additions – rolled out here at the company’s Cisco Live customer event – is the new set of professional certifications for developers utilizing Cisco’s growing DevNet developer community. The Cisco Certified DevNet Associate, Specialist and Professional certifications will cover software development for applications, automation, DevOps, cloud and IoT. They also target software developers and network engineers who are building the software proficiency to develop applications and automated workflows for operational networks and infrastructure. “This certification evolution is the next step to reflect the critical skills network engineers must have to be at the leading edge of network-enabled business disruption and delivering customer excellence,” said Mike Adams, vice president and general manager of Learning@Cisco. “To perform effectively in this new world, every IT professional needs skills that are broader, deeper and more agile than ever before. And they have to be comfortable working as a multidisciplinary team including infrastructure network engineers, DevOps and automation specialists, and software professionals.”
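To give a flavour of the "automated workflows for operational networks" these certifications cover, here is a hedged sketch of one such workflow: polling a device's REST API and reporting interfaces that are administratively up but operationally down. The endpoint, authentication, and JSON shape are hypothetical placeholders, not Cisco's actual DevNet APIs.

```python
import requests  # third-party HTTP client

# Hypothetical network-automation workflow: poll a device's REST API and
# report interfaces that are administratively up but operationally down.
# The URL, auth, and JSON shape are placeholders, not a real Cisco API.
DEVICE_URL = "https://device.example.net/api/interfaces"

def down_interfaces(session: requests.Session) -> list[str]:
    resp = session.get(DEVICE_URL, timeout=10)
    resp.raise_for_status()
    return [
        iface["name"]
        for iface in resp.json()["interfaces"]
        if iface["admin_state"] == "up" and iface["oper_state"] == "down"
    ]

if __name__ == "__main__":
    with requests.Session() as s:
        s.headers["Authorization"] = "Bearer <token>"  # placeholder credential
        for name in down_interfaces(s):
            print(f"{name}: admin up / oper down -- investigate")
```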
The Rise of 'Purple Teaming'
Part of what makes Red Teaming and Purple Teaming so valuable is that they provide insight into the specific tactics and approaches that attackers might use. Organizations can enhance this visibility by incorporating deception technology into the testing program. The first benefit comes from detecting attackers early by enticing them to engage with decoys or deception lures. The second comes from gathering full indicators of compromise (IOCs) and the tactics, techniques, and procedures (TTPs) behind lateral movement activity. This significantly enhances visibility into how and when attackers circumvent security controls, enriching the information that typically results from these exercises. Cyber deceptions deploy traps and lures on the network without interfering with daily operations. A basic deployment can easily be completed in under a day, providing the Blue Team an additional detection mechanism that blends in with the operational environment. This creates more opportunities to detect when the Red Team bypasses a defensive control, forcing team members to be more deliberate with their actions and making simulated attack scenarios more realistic.
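As a minimal illustration of a deception lure, the sketch below runs a decoy TCP listener with no real service behind it and logs every connection attempt; since legitimate traffic has no reason to touch it, any connection is a high-signal detection event. The port choice and logging are illustrative, not representative of commercial deception platforms.

```python
import socket
import datetime

# Minimal decoy listener: binds a port no legitimate service uses and
# logs every connection attempt. Any touch is a high-signal detection
# event, since real users have no reason to connect. Illustrative only;
# production deception platforms deploy far richer, interactive decoys.
DECOY_PORT = 2222  # e.g., an SSH-looking port with nothing behind it

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("0.0.0.0", DECOY_PORT))
    srv.listen()
    while True:
        conn, (addr, port) = srv.accept()
        stamp = datetime.datetime.now().isoformat(timespec="seconds")
        print(f"[{stamp}] decoy touched by {addr}:{port}: possible lateral movement")
        conn.close()
```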
Agile vs. Top-Down Management: Leadership Must Evolve as an Organization Matures
Agile was first used in software development. Its purpose then was to deliver a more relevant product to customers through smaller iterations on a shorter cycle, which gave developers the opportunity to incorporate user feedback into future releases. As other teams and departments appropriated Agile, its purpose evolved into accelerating growth by reducing the time-to-value of growth initiatives like marketing, while also ensuring a project is effective before ramping it up to scale. The problem with Agile at scale is that time-to-value usually isn’t dependent upon a single team’s ability to execute. Even projects that appear small involve multiple teams, and stakeholders within them are juggling competing priorities and relying upon different feedback sources to inform the direction of projects. When an Agile team operates without awareness of the work in the rest of the organization, it can become misaligned from other teams. Real-time communication between teams is necessary at these handoff points; otherwise, it’s as if teams are playing hot potato with projects, tossing the ball to someone who may not be ready to catch it.
What is data protection by design and default?
To explain how the approach works, we must first break it into its two component parts. The first is data protection by design, which ensures that organisations address information security and privacy in the planning stage of any system, service, product or process that uses personal data. With cyber attacks on the rise, growing public interest in data privacy and the strengthened penalties introduced by the GDPR, it makes sense to prioritise information security. If you don’t, you’ll be left trying to tack security controls onto existing set-ups. This can lead to improperly implemented controls that expose vulnerabilities, as well as expensive restructuring projects. ... Data protection by default ensures that organisations conduct data processing activities only if they are necessary to achieve a specific goal. As such, it links to the GDPR’s principles of data minimisation and purpose limitation. One way to achieve this is to give data subjects the strongest possible privacy settings by default – hence the name. This helps prevent data being collected excessively, and gives data subjects the option to consent to more extensive data practices if they want to use other services.
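In code, "strongest possible privacy settings by default" translates naturally into defaults that encode the most restrictive choice, so broader processing requires an explicit opt-in. A minimal sketch, with field names invented for illustration:

```python
from dataclasses import dataclass

# Privacy-by-default sketch: every field defaults to the most restrictive
# setting, so any broader processing requires an explicit opt-in by the
# data subject. Field names are invented for illustration.
@dataclass
class PrivacySettings:
    analytics_tracking: bool = False      # opt-in, never opt-out
    marketing_emails: bool = False
    profile_visibility: str = "private"   # most restrictive default
    retention_days: int = 30              # data minimisation: keep the minimum

# A new account gets the strictest settings without being asked.
new_user = PrivacySettings()
print(new_user)

# Consent to more extensive processing is an explicit, recorded action.
consented = PrivacySettings(marketing_emails=True)
```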
Quote for the day:
"Wisdom comes from experience. Experience is often a result of lack of wisdom." -- Terry Pratchett