Furlonger drilled down into the problem with today's technology as an enabler for the programmable economy. "How many of you are still using systems that were designed in the 1960s or 1970s or 1980s, where the speed of movement of information, the ability to access information, is very different than it is today? … You're trying to create apps and you're trying to link APIs with systems that clearly weren't designed for that. You're trying to move across ecosystems between one country and another or between one supply chain and another supply chain. Your systems just won't accommodate that kind of flexibility."
“The new arrangement will provide stronger obligations on companies in the U.S. to protect the personal data of Europeans and stronger monitoring and enforcement by the U.S. Department of Commerce and Federal Trade Commission (FTC), including through increased cooperation with European Data Protection Authorities. The new arrangement includes commitments by the U.S. that possibilities under U.S. law for public authorities to access personal data transferred under the new arrangement will be subject to clear conditions, limitations and oversight, preventing generalised access. Europeans will have the possibility to raise any enquiry or complaint in this context with a dedicated new Ombudsperson.”
The suite allows customers to access collected security data via a portal, report potential threats, and gain real-time insight into an event as it unfolds. The portal presents threat profiles that can be reviewed at a glance. Analytics can be applied to the available information, tickets created, activity tracked, and the final resolution reported to the customer. The suite has a security information and event management (SIEM) back end that can import client-specific vulnerability data from any location and from a variety of SIEM tools. Analytics can be applied to that information for feedback on ongoing activity. The SIEM information-handling system can accept input from the customer’s existing security tools and systems, Kelleher said in the announcement.
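The ingest-analytics-ticket flow described above can be sketched in a few lines. This is an illustrative toy, not the vendor's implementation: the event fields, the severity threshold, and the `Ticket` structure are all assumptions made for the example.

```python
from dataclasses import dataclass, field

# Illustrative event record; real SIEM tools emit far richer schemas.
@dataclass
class SecurityEvent:
    source: str      # e.g. "firewall", "endpoint-agent"
    kind: str        # e.g. "failed_login", "port_scan"
    severity: int    # 1 (info) .. 10 (critical)

@dataclass
class Ticket:
    summary: str
    events: list = field(default_factory=list)

def ingest(raw_feeds):
    """Normalize events from several customer tools into one stream."""
    for feed in raw_feeds:
        yield from feed

def correlate(events, threshold=7):
    """Toy analytic: open one ticket per event kind whose severity
    crosses the threshold (threshold value is an assumption)."""
    tickets = {}
    for ev in events:
        if ev.severity >= threshold:
            t = tickets.setdefault(ev.kind, Ticket(summary=f"High-severity {ev.kind}"))
            t.events.append(ev)
    return list(tickets.values())

# Two hypothetical customer tools feeding the back end.
feeds = [
    [SecurityEvent("firewall", "port_scan", 8), SecurityEvent("firewall", "ping", 1)],
    [SecurityEvent("endpoint-agent", "failed_login", 9)],
]
tickets = correlate(ingest(feeds))
```

The low-severity `ping` event is ingested but never escalated, while the two high-severity events each open a ticket that can then be tracked to resolution.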
“A variety of skills and backgrounds in data science are interesting to us,” McMahon explains. “Publishing a paper on data science is an excellent way to stand out as a candidate,” she adds. Microsoft regularly sends recruiters to the Strata conference to meet professionals in the field. While a computer science degree is helpful, it is not required. McMahon has seen professionals from bioinformatics and other informatics fields make the transition to Microsoft. Even those without formal training in data science or informatics can launch a new career with a little passion and persistence. “Self-study and passion for data science are key qualities in data science professionals,” says Jeremy Stanley, vice president of data science at Instacart, an e-commerce company that arranges personalized grocery shopping and delivery services.
Startups, big banks and database companies say that blockchain, the software behind bitcoin, could forever alter financial market infrastructure. How and when banks and other financial companies adopt blockchain technology hinges largely on regulation, most of which is yet to be written. In the highly regulated banking industry, clear rules help companies build products that both meet demand and quickly gain regulators’ approval. A group of banks successfully tested a private blockchain in January, renewing focus on its potential. Federal banking regulators are likely years away from formalizing blockchain regulation, fueling uncertainty for banks that want to use the technology to handle transactions more efficiently. Investments in blockchain-related startups increased through 2015, when private-equity firms, banks and payments processors invested in companies such as Digital Currency Group and Chain.
Responsibility for this disturbing lack of preparation, PwC claims, comes from the top, because "many boards are not sufficiently proactive regarding cyber threats, and generally do not understand their organisation's digital footprint well enough to properly assess the risks". Indeed, fewer than half of board members are said to actually request information about how their organisation is prepared, when it comes to fending off or dealing with a cyberattack. It also seems that the board is willing to pass the buck when it comes to taking responsibility for dealing with a "cyber crisis", with IT security staff expected to deal with outcomes in almost three-quarters of cases. That strategy, argues Andrew Gordon, global leader of forensic services at PwC, is not the right course of action.
The findings indicate that the majority of mobile app development work will still be done outside of IT, with roughly two-thirds of apps being developed by business application vendors, systems integrators, digital agency partners and developer partners; only 35% of mobile apps will be developed by IT staff. However, IT staff can expect to do more work on mobile apps in the future. According to the findings, professional developers in IT spent 43% of their time on internal app projects in 2015, but over the next two years they can expect to spend 63% of their time on internal app projects -- a jump of 20 percentage points.
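It is worth being precise about that 43%-to-63% shift: a change of 20 percentage points corresponds to a relative increase of roughly 47% in time spent, a distinction that one line of arithmetic makes clear.

```python
# Time share on internal app projects, from the survey findings.
before, after = 0.43, 0.63

point_change = (after - before) * 100               # difference in percentage points
relative_change = (after - before) / before * 100   # relative growth in percent

print(f"{point_change:.0f} percentage points")      # 20 percentage points
print(f"{relative_change:.0f}% relative increase")  # 47% relative increase
```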
The upcoming release of the International Roadmap for Devices and Systems, the biennial forecast of future progress in processors, will for the first time not be centered on Moore’s Law. The physical limit of how small process technology can get is now well within sight. Judging by chipmaker predictions, once five-nanometer process technology arrives, sometime around 2021, the physics that governs the way chips behave today will no longer apply. At that scale we enter the unpredictable realm of quantum mechanics, and it is unclear which way technological progress will turn. Will we finally get viable quantum computers, or will engineers and scientists focus on optimizing other elements of computing systems?
Open source is a strong choice for enterprises with a growing expectation of flexibility and faster results: there is no vendor lock-in, and the associated costs are lower than for proprietary solutions. But while open source throws open immense possibilities, beware of its biggest challenge: assuring security, access control and governance of the data lake. There is also the risk that a poorly managed data lake ends up as an aggregate of data silos in one place. CIOs must caution teams about the need to train lay users to appreciate key nuances – contextual bias in data capture, the incomplete nature of datasets, ways to merge and reconcile different data sources, and so on – which is a herculean task in every way.
With so many open source code reuse scenarios active in enterprise DevOps shops, you will want to continually monitor code for newly registered vulnerabilities. Enterprises can view reusing existing assets as a built-in savings by not having to purchase or create new assets. In the case of reusing legacy source code, there is also a time savings, as developers want to shrink time to market for code-based projects. “Let’s say you have a company that is pulling information from a database to display in pie charts. You have a rendering component that you will reuse. When you’re starting on a new product, do you build a new rendering library from scratch or do you use the existing one?” asks Pittenger. Even if those components are open source, there can be great savings in reusing the library.
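The monitoring the passage calls for boils down to regularly checking each reused component's pinned version against a feed of known-vulnerable versions. The sketch below is a minimal illustration of that check; the component names, versions, and advisory data are made up for the example, and a real shop would pull advisories from a feed such as the NVD rather than a hard-coded dictionary.

```python
# Hypothetical manifest of reused components and their pinned versions.
manifest = {"chart-renderer": "1.2.0", "db-connector": "3.4.1"}

# Hypothetical advisory feed: component -> versions with known vulnerabilities.
advisories = {
    "chart-renderer": {"1.2.0", "1.2.1"},
}

def vulnerable_components(manifest, advisories):
    """Return the reused components whose pinned version appears
    in the advisory feed and therefore needs attention."""
    return [name for name, version in manifest.items()
            if version in advisories.get(name, set())]

flagged = vulnerable_components(manifest, advisories)
```

Because new vulnerabilities are registered against old versions all the time, a check like this belongs in a scheduled job or CI step, not a one-off audit.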
Quote for the day:
"We cannot change the cards we are dealt, just how we play the hand." -- Randy Pausch