Daily Tech Digest - November 27, 2019

10 Predictions How AI Will Improve Cybersecurity In 2020
Nicko van Someren, Ph.D. and Chief Technology Officer at Absolute Software, observes that “Keeping machines up to date is an IT management job, but it's a security outcome. Knowing what devices should be on my network is an IT management problem, but it has a security outcome. And knowing what's going on and what processes are running and what's consuming network bandwidth is an IT management problem, but it's a security outcome. I don't see these as distinct activities so much as seeing them as multiple facets of the same problem space, accelerating in 2020 as more enterprises choose greater resiliency to secure endpoints.” ... Josh Johnston, Director of AI at Kount, predicts that “the average consumer will realize that passwords are not providing enough account protection and that every account they have is vulnerable. Captcha won’t be reliable either, because while it can tell if someone is a bot, it can’t confirm that the person attempting to log in is the account holder. AI can recognize a returning user. AI will be key in protecting the entire customer journey, from account creation to account takeover, to a payment transaction. ...”
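As an illustration of the kind of approach Johnston describes (and not Kount's actual system), here is a minimal sketch of scoring a login attempt against an account's historical behaviour with an off-the-shelf anomaly detector; the features, numbers and threshold are invented for illustration.

```python
# Hypothetical sketch: flag logins that don't look like the returning account
# holder, using an anomaly detector trained on that account's past logins.
# Features, values and the threshold are illustrative assumptions only.
import numpy as np
from sklearn.ensemble import IsolationForest

# Historical logins for one account: [hour_of_day, device_age_days, typing_speed_cps]
history = np.array([
    [8, 400, 6.1],
    [9, 401, 5.8],
    [8, 403, 6.0],
    [19, 410, 5.9],
])

model = IsolationForest(contamination="auto", random_state=0).fit(history)

# A new attempt at 3 a.m. from a brand-new device, typed much faster than usual.
attempt = np.array([[3, 0, 11.2]])
score = model.decision_function(attempt)[0]  # lower = more anomalous

if score < 0:
    print("Step-up authentication required (anomalous login), score:", score)
else:
    print("Login looks like the returning user, score:", score)
```

In a real deployment this kind of behavioural signal would be only one input among many (device fingerprinting, payment-risk features and so on) across the whole customer journey.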


Wolfram Language has limitations, and has been described by some users as better suited to solving a wide range of predetermined tasks, rather than being used to build software. It also seems there is still a way to go for Wolfram Language – it didn't, for example, feature in the IEEE's recent list of top programming languages. Wolfram has said that Wolfram Language is not just a language for telling computers what to do, but a way for both computers and humans to represent computational ways of thinking about things. Of late Wolfram has been more bold in how he talks about Wolfram Language, describing it as a "computational language" that could even help bridge the gulf between ourselves and future non-human intelligences, be they artificial intelligence (AI) or extraterrestrial. As esoteric a pursuit as it might seem, Wolfram believes the need for this lingua franca is timely, as machine-learning systems increasingly make decisions about our lives -- whether that's screening loan applications today or maybe even choosing whether to kill people tomorrow.


Tech jobs: These are the skills hiring managers are looking for now


CompTIA noted that the technology workforce, in particular, has been under the microscope for its lack of diversity. Diversity in tech staffing is likely to improve due to continuing pressure, the association said, but "fully diverse and inclusive environments still lie further in the future". A wide range of research and anecdotal examples proves that there's still much work to do in achieving equity, from data on wage gaps to the makeup of executive teams to ongoing reports of abusive behaviour, CompTIA said. Although 30% of companies feel that there has been significant improvement in the diversity of the tech workforce over the past two years, previous CompTIA research shows that "sentiment tends to skew more positive than reality on this topic." "The trend may be heading in the right direction, but the chasm was so wide that it will take significant time and intentional changes to close," said CompTIA, noting that there is a long list of potential actions that could improve the situation. Flexible work arrangements, including the physical environment, can create more opportunities and a more welcoming atmosphere, especially if there is a hard look at how the existing arrangements unintentionally create barriers, the association said.


AI Is The Link Between Big Data & Persons-Level Measurement

To highlight the shortcomings of big data from a measurement perspective, we conducted an analysis in the U.S. earlier this year that compared raw set-top box data with set-top box data that we calibrated with Nielsen panel data. The analysis found that the uncalibrated data is inherently biased and underrepresents minority audiences. That’s not to say, however, that big data has no value. Quite the opposite. But it does need to be grounded in a foundational truth set. That’s where our panels and artificial intelligence (AI) come into play. Our panel data—the key to persons-level measurement—is the perfect truth set for training big data. Through the application of AI, we use big data to dramatically broaden our measurement capabilities while preserving quality and representativeness. Today, AI is integral to our measurement methodologies. For example, it played a pivotal role in the development of our enhanced measurement capabilities for local TV markets, which combine the scale of big data (return path data, or RPD, from TV sets) with fully representative in-market panel data.
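A heavily simplified sketch of the calibration idea described above — using representative panel measurements as a truth set to correct bias in raw set-top box estimates. The model choice, feature names and numbers below are illustrative assumptions, not Nielsen's methodology.

```python
# Hypothetical sketch: learn a correction from raw set-top box (RPD) estimates
# to panel-measured "truth" so the big data inherits the panel's representativeness.
# All features and numbers are made up for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression

# Each row: [raw_rpd_rating, share_of_demo_homes_with_rpd]
X_rpd = np.array([
    [3.2, 0.15],
    [5.1, 0.22],
    [2.4, 0.08],
    [4.0, 0.18],
])
# Panel-measured ratings for the same programs/demos (the truth set).
y_panel = np.array([3.9, 5.6, 3.3, 4.6])

calibration = LinearRegression().fit(X_rpd, y_panel)

# Apply the learned correction to a new, uncalibrated RPD estimate.
new_rpd = np.array([[2.9, 0.10]])
print("calibrated rating:", calibration.predict(new_rpd)[0])
```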


GDPR Data Regulations & Commercial Fines


The public and private sectors are both impacted, although government agencies have more leeway under the GDPR in general due to requirements to retain and use data to deliver services to citizens. In terms of best practice for handling a request, the advice from the UK’s Information Commissioner’s Office is that there should be a policy for recording all “subject access requests” and that, based on Recital 59 of the GDPR, organisations “provide means for requests to be made electronically, especially where personal data are processed by electronic means.” This process will start with an access request form, but when it comes to identity, the guidance is unclear. A number of organisations are asking for a set of documents similar to what most banks require to open an account: a “proof of identity” such as a passport, photo driving licence or birth certificate, along with a “proof of address” such as a utility bill, bank statement or credit card statement. This requirement to verify identity from copies or scans of electronic documents is a major weakness in the process.
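As a sketch of the ICO advice to record every subject access request and accept them electronically, one might log each request with its channel, identity-verification status and statutory deadline. The field names below are illustrative assumptions; the one-month response window reflects GDPR Article 12(3), approximated here as 30 days.

```python
# Hypothetical sketch of a subject access request (SAR) log entry, reflecting the
# guidance to record all requests and allow them to be made electronically.
# Field names are illustrative; the one-month deadline (GDPR Art. 12(3)) is
# approximated as 30 days for simplicity.
from dataclasses import dataclass, field
from datetime import date, timedelta
from typing import List, Optional

@dataclass
class SubjectAccessRequest:
    request_id: str
    received: date
    channel: str                           # e.g. "web form", "email", "post"
    identity_verified: bool = False
    identity_evidence: List[str] = field(default_factory=list)  # e.g. ["passport scan"]
    due: Optional[date] = None

    def __post_init__(self):
        if self.due is None:
            # Respond without undue delay, and within one month of receipt.
            self.due = self.received + timedelta(days=30)

sar = SubjectAccessRequest("SAR-2019-0142", date(2019, 11, 27), "web form")
print("response due by:", sar.due, "| identity verified:", sar.identity_verified)
```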


Non-functional Requirements
Simply put, a non-functional requirement is a specification that describes the system’s operational capabilities and constraints, which enhance its functionality. These may be speed, security, reliability, etc. We’ve already covered different types of software requirements, but this time we’ll focus on non-functional ones and how to approach and document them. If you’ve ever dealt with non-functional requirements, you may know that different sources and guides use different terminology. For instance, the ISO/IEC 25000 standards framework defines non-functional requirements as system quality and software quality requirements. BABOK, one of the main knowledge sources for business analysts, suggests the term non-functional requirements (NFR), which is currently the most common designation. Nevertheless, these terms all refer to the same kind of requirement – one that describes operational qualities rather than the behavior of the product. The list of such requirements also varies depending on the source.
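To make the distinction concrete, here is a hedged sketch of documenting non-functional requirements as measurable quality attributes rather than behaviours. The attributes, metrics and targets below are invented for illustration and are not drawn from ISO/IEC 25000 or BABOK.

```python
# Hypothetical sketch: recording non-functional requirements as measurable
# quality attributes (speed, security, reliability, ...) rather than behaviours.
# Attribute names, metrics and targets are illustrative only.
from dataclasses import dataclass

@dataclass
class NonFunctionalRequirement:
    attribute: str   # quality attribute, e.g. "performance"
    metric: str      # how it will be measured
    target: str      # the acceptance threshold

nfrs = [
    NonFunctionalRequirement("performance", "95th-percentile page load time", "< 2 s"),
    NonFunctionalRequirement("reliability", "monthly service availability", ">= 99.9 %"),
    NonFunctionalRequirement("security", "time to patch critical vulnerabilities", "< 72 h"),
]

for nfr in nfrs:
    print(f"{nfr.attribute}: {nfr.metric} {nfr.target}")
```

Phrasing each requirement as an attribute, a metric and a target keeps it testable, which is what separates a documented NFR from a vague quality aspiration.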


The Road to 2030 Must Be Circular


What gets exciting is when you can find the perfect material match in someone else’s waste. Carbon fiber is a great example. It turns out computers use a similar grade of carbon fiber to airplanes. So we reclaim aerospace material for Latitude, our commercial notebook line. To date, Dell has prevented more than 2 million pounds of carbon fiber from ending up in landfills. And in this case, the benefits go far beyond the environment. We’ve partnered with Carbon Conversions, a start-up based in South Carolina with a mission to reclaim and recycle carbon fiber. Carbon Conversions has redesigned and reengineered the papermaking process to produce carbon fiber non-woven fabrics, bringing new growth to an area historically impacted by manufacturing moving overseas. Finding more partners like Carbon Conversions will be important. It will also be important to increase our own recycling streams dramatically (i.e. you all have a role to play too). We must make it as easy as possible for you to recycle.


Bringing Business and IT Together, Part II: Organizational Alignment

COA is similar to other continuous improvement processes such as continuous quality improvement (CQI) and continuous process improvement (CPI). Just as CQI and CPI demand structure and metrics, so too does COA. Continuous improvement is evolutionary and incremental. It is manageable only when understood as a set of interconnected components that can be identified and measured. The COA Framework illustrated in Figure 1 provides the necessary structure. This three-dimensional structure associates the core elements of COA – those of organizational alignment and working relationships – with the activities of continuous improvement. The framework identifies the components that can be managed, measured, and modified to improve the overall alignment of business and technology organizations. ... Organization-to-organization relations are ideally structured and business-like. Conversely, person-to-person relationships are best when unstructured and friendly. Team-to-team relationships seek a balance between the two extremes.


VMware doubles up on Kubernetes play


Many of our large customers have Kubernetes clusters on vSphere, Amazon EC2 and sometimes bare metal. These are managed by different teams, making it difficult to manage and control everything. That was a problem we wanted to solve. Then comes the next question on how we can help customers build and deploy new applications. Historically, we’ve relied on Pivotal as a partner to help customers modernise their applications. While Pivotal Cloud Foundry is a great platform, Pivotal last year decided to use Kubernetes as the default runtime for their developer platform. Meanwhile, Spring Boot was becoming the de facto way by which people built microservices. So, we felt that by bringing Pivotal into the family, we could offer a very comprehensive solution to help customers build, run and manage their modern applications.
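A minimal sketch of the multi-cluster pain point described above, assuming the official Kubernetes Python client and a local kubeconfig with one context per cluster (the context names are hypothetical). A single loop inventories nodes across vSphere, EC2 and bare-metal clusters that are run by different teams.

```python
# Hypothetical sketch: inventory nodes across several Kubernetes clusters that are
# managed by different teams, using one kubeconfig context per cluster.
# Context names are made up for illustration.
from kubernetes import client, config

contexts = ["vsphere-prod", "ec2-dev", "baremetal-lab"]

for ctx in contexts:
    # Load credentials for this cluster's context from the local kubeconfig.
    config.load_kube_config(context=ctx)
    nodes = client.CoreV1Api().list_node().items
    print(f"{ctx}: {len(nodes)} nodes")
    for node in nodes:
        print("  ", node.metadata.name, node.status.node_info.kubelet_version)
```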


Using Kanban with Overbård to Manage Development of Red Hat JBoss EAP

Red Hat JBoss EAP (Enterprise Application Platform) has become a very complex product. As a result, planning EAP releases is also increasingly complicated. In one extreme case, with the team working on the next major release while still developing features for the previous minor release, planning for that major release went on for 14 months with the requirements constantly changing. However, spending more effort on planning didn't improve the end result; it didn't make us any smarter or more accurate. We'd rather spend more time doing things than talking about them. That was a major problem. In addition, there were cases in which requirements were misunderstood or miscommunicated and we only found that out late in the cycle. We had to find a way to collectively iterate over a requirement and make sure everyone understood what was to be done. In some cases we could go as far as implementing a proof of concept before we could be certain we fully understood the problem and the proposed solution.



Quote for the day:


"Inspired leaders move a business beyond problems into opportunities." -- Dr. Abraham Zaleznik

