Julia is a free, open source, high-level, high-performance dynamic programming language for numerical computing. It has the development convenience of a dynamic language with the performance of a compiled, statically typed language, thanks in part to a JIT compiler based on LLVM that generates native machine code, and in part to a design that achieves type stability through specialization via multiple dispatch, which makes it easy to compile to efficient code. ... What we’re seeing here is that Julia code can be faster than C for a few kinds of operations, and no more than a few times slower than C for others. Compare that to, say, R, which can be almost 1,000 times slower than C for some operations. Note that one of the slowest tests for Julia is Fibonacci recursion; that is because Julia currently lacks tail recursion optimization. ... Julia fans claim, variously, that it has the ease of use of Python, R, or even Matlab. These comparisons do bear scrutiny, as the Julia language is elegant, powerful, and oriented towards scientific computing, and the libraries supply a broad range of advanced programming functionality.
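To make the benchmark concrete, here is a minimal Julia sketch of the doubly recursive Fibonacci that such cross-language comparisons typically use, alongside an iterative rewrite; since the doubly recursive form cannot be eliminated by tail-call optimization alone, an explicit iterative version is the usual fix. The function names and timings are illustrative, not taken from any official benchmark suite:

```julia
# Naive doubly recursive Fibonacci: the classic cross-language micro-benchmark.
# Each call spawns two more, so function-call overhead dominates the runtime.
fib(n) = n < 2 ? n : fib(n - 1) + fib(n - 2)

# Iterative rewrite: same answer with no recursion at all, so it runs at
# essentially native speed regardless of how calls are compiled.
function fib_iter(n)
    a, b = 0, 1
    for _ in 1:n
        a, b = b, a + b
    end
    return a
end

@assert fib(20) == fib_iter(20) == 6765
@time fib(30)       # dominated by roughly 2.7 million recursive calls
@time fib_iter(30)  # near-instant by comparison
```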
Using machine learning to understand the human immune system
The human immune system works in numbers large enough to make your head spin. There are two billion lymphocytes in the body, among them the so-called 'helper' T cells and the 'cytotoxic' or 'killer' T cells. Each T cell can recognise antigens -- the triggers that set off the immune system -- that are the signatures of bacteria, viruses, fungi or other invaders that have entered the body. Each T cell can bind to hundreds of different antigens, each potentially unique to a different bacterium or virus. Once a T cell registers a hit, depending on what type of T cell it is, it may kill the invader itself, or signal the millions of other immune cells to come and take on the wrongdoer too. Anyone taking a snapshot of the immune system while the T cells are activated, by noting which T cell receptors are engaged and which antigens they bind to, could work out which disease has taken hold of the body. And, once the disease is known, doctors can see more clearly how it can be treated.
Why PGP is fundamentally flawed and needs to be fixed
From my vantage point, the biggest problem with encryption (outside of these vulnerabilities) is the fact that few people actually use it. Sure, those who are really, really concerned about privacy and security will make use of PGP in some form or fashion, but the average user (of which there are hundreds of millions) wouldn't know PGP if it reached out from their monitor and slapped them across the face to say, "Pay attention to me!" There's a reason for this. The average user doesn't know where to begin working with encryption ... on any level. Try talking your parents through using encryption in email. Watch their faces go slack as every word you utter flies over their heads. The thing is, if PGP (or OpenPGP, or GPG, or GnuPG ... you get the idea) is to succeed, it needs to be used (and not just by the enlightened few). Encryption needs to become the standard. For that to happen, it needs to be built into email clients such that users require zero hand-holding to understand or use the technology. But before that can happen, the implementations of PGP need to be fixed.
Storage class memory set to upset hardware architectures
The more expensive, lower-capacity devices can serve as a host or array cache, adding the benefit of persistence compared with simply using DRAM. The extended endurance of these products compared with NAND flash also makes them better suited to write caching or to use as an active data tier. Hyper-converged infrastructure (HCI) solutions can take advantage of low-latency persistent memory deployed in each host. Placing the persistent storage on the PCIe bus, or even the memory bus, will significantly reduce I/O latency. But this also risks exposing inefficiencies in the storage stack, so suppliers will want to be quick to identify and fix any issues. Disaggregated HCI solutions, such as those from Datrium and NetApp, should see large performance improvements. In both cases, the architecture is built on shared storage with local cache in each host. Performance is already high with NAND flash, but persistent caching with products such as Optane offers more resiliency (and less cache warm-up).
Data readiness is much more than mere cleanliness
Data set readiness comprises the traditional data preparation ideas: data cleanliness and consistency, de-duplication and the management of unstructured data. (The seemingly simple task of mailing address normalization is a data preparation discipline in its own right.) In the world of the V’s – variety, volume, velocity, veracity, and even validity and volatility – the biggest challenge is variety. Since data sets evolve over time as domain experts look for new insights and correlations with new data sources, some agility in acquiring and integrating new data sets is also part of data set readiness, albeit in a “meta” sort of way: being ready to get more data ready is a prerequisite. Data pipeline readiness addresses some of the larger big data V’s: volume and velocity. Once you have models to execute, operationalizing them to run reliably at scale and at business speed brings an entirely new set of challenges. Can your business handle the massive data flows? Can it handle them at ever-increasing speed?
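As a toy illustration of the de-duplication piece, here is a minimal Julia sketch that collapses records sharing a normalized key; the records, field names, and `normalize_key` helper are hypothetical, for illustration only:

```julia
# A toy sketch of one data set readiness task: de-duplicating records on a
# normalized key. The records and field names here are hypothetical.
records = [
    (name = "Ada Lovelace", email = "ADA@example.com "),
    (name = "Ada Lovelace", email = "ada@example.com"),
    (name = "Alan Turing",  email = "alan@example.com"),
]

# Normalize before comparing: trim stray whitespace and lowercase the key,
# so trivially different spellings of the same address collapse together.
normalize_key(email) = lowercase(strip(email))

# Base Julia's `unique(f, itr)` keeps the first element for each distinct f(x).
deduped = unique(r -> normalize_key(r.email), records)

@assert length(deduped) == 2
```

In practice the normalization step is where most of the work lives (as the mailing-address aside suggests); the dedup itself is the easy part once the key is trustworthy.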
Companies are struggling with security automation
“The cybercrime landscape is incredibly vast, organized and automated – cybercriminals have deep pockets and no rules, so they set the bar,” said Amy James, Director of Security Portfolio Marketing at Juniper Networks. “Organizations need to level the playing field. You simply cannot have manual security solutions and expect to successfully battle cybercriminals, much less get ahead of their next moves. Automation is crucial.” The growing threat landscape and the security skills gap facing cybersecurity teams demand that organizations implement automation for a stronger security posture. Respondents recognize this growing importance and how automation can improve productivity, address the growing volume of threats and reduce the rate of false positives. The top two benefits of security automation, according to respondents, are increased productivity of security personnel (64 percent) and automated correlation of threat behavior to address the volume of threats (60 percent). More than half of respondents (54 percent) say these automation technologies simplify the process of detecting and responding to cyber threats and vulnerabilities.
Will blockchain bring data ownership back to users?
Blockchain, a technology that has seen success in cryptocurrency and beyond thanks to its security, efficiency and decentralized control, has been seen as a way of democratizing data and putting ownership back into the hands of users. In contrast to current practice, where ownership of user data is held by the enterprise, blockchain would enable the creation of a self-sovereign identity, in which individuals control their own identities and personal data and are able to decide whom to share them with, and to what extent. In addition, blockchain offers the possibility of micro-incentivizing people to share data at will, which could significantly disrupt current ways of working for industries such as advertising and content. Organizations will need to come to terms with this new reality and align with the changing mindsets and desires of their users when it comes to the management of personal data. While a self-sovereign identity enabled by blockchain could revolutionize how personal data is managed, it does not come without hurdles. For starters, the burden of managing and allocating access would have to be borne by the individual.
IT Mission, Vision & Values Statements: Foundations For Success
The difference between a vision statement and a mission statement can be confusing. Some enterprise vision statements are actually missions, and vice versa. A good vision paints a picture of a desired future state. It appeals to the heart, inspiring employees, customers, and other stakeholders to do their best. A good vision rarely changes, remaining constant through different leaders, economic circumstances, and challenges. A mission describes how the enterprise will get to that desired future state. It appeals to the head and is an anchor against which departments and programs can be measured to determine how well they support the enterprise. Missions evolve to reflect new challenges as intermediate goals are attained. ... Value statements describe the principles, beliefs and operating philosophy that shape an enterprise's culture. Strong values serve as a moral compass, guiding interactions among employees and providing a standard against which behaviors can be assessed. Passion, teamwork, integrity, diversity and quality are found in many enterprise value statements.
Slack outage causes disruption, but highlights importance of team chat
“For individuals and organizations using team collaboration tools such as Slack, real-time communications have become ubiquitous and second nature to their work,” said Raúl Castañón-Martínez, a senior analyst at 451 Research. High expectations for service availability, he added, mean that every downtime incident will be perceived as a “serious disruption.” “On the positive side, this signals that Slack has been successful in permeating the enterprise, and team collaboration tools are rapidly becoming a core productivity tool, alongside email and calendar,” he said. Slack did not provide Computerworld with further details of the cause of the outage or the number of users affected. A spokesperson said: “On June 27th between 6:33am and 9:49am PT, Slack experienced an outage where people could not connect to their workspace. We worked as quickly as possible to bring Slack back to normal for everyone, and we are continuing to investigate the cause of this issue. We’re deeply sorry for the disruption. Please see the Slack Status page for the latest information.”
Online banks struggle to stay connected with younger mobile users
J.D. Power’s findings imply that those digital-only institutions aren’t just competing with other banks. Consumer use of other apps, like Uber or Seamless, is influencing customers’ expectations in the banking sphere as well. Direct banks earned an overall satisfaction score of 863 out of 1,000 points in J.D. Power’s latest ranking, compared with traditional retail banks’ overall score of 806. Direct banks also scored higher than their traditional counterparts for service across all other banking channels, including websites and call centers. On mobile banking, however, direct banks held the narrowest lead over traditional retail banks, earning a score of 864 compared with traditional banks’ score of 850. Last year, both direct and traditional banks scored 872 on satisfaction with mobile channels. J.D. Power also found that direct bank customers’ awareness and usage of various mobile banking features had declined year over year across every single feature the firm tracks, including bill pay and person-to-person payments.
Quote for the day:
"Learn to appreciate what you have, before time makes you appreciate what you had." -- Anonymous