Daily Tech Digest - August 19, 2017

Oracle doesn't want Java EE any more

Oracle plans to explore its desire to offload Java EE with the open source community, licensees, and candidate foundations. Although Oracle has not named possible candidates, the Apache Software Foundation and the Eclipse Foundation are likely possibilities. Oracle has already donated the OpenOffice productivity suite and the NetBeans IDE to Apache, and the Hudson integration server to Eclipse. Like Java, all three technologies—OpenOffice, NetBeans, and Hudson—came to Oracle through its 2010 acquisition of Sun Microsystems. Eclipse is ready to take on Java EE if chosen. “We believe that moving Java EE to a vendor-neutral open source foundation would be great for both the platform and the community,” said Eclipse Executive Director Mike Milinkovich. “If asked to do so, the Eclipse Foundation would be pleased to serve as the host organization.”


Next step in the content evolution

A recent ASG-commissioned technology adoption profile study by Forrester Consulting, “Today’s Enterprise Content Demands a Modern Approach”, found that 95% of respondents were using more than one system to manage enterprise content, including 31% using five or more systems. This leads to disjointed information and difficult access. Lack of flexibility is therefore one clear shortcoming of existing approaches to ECM. Organisations want to invest in systems and technology that allow them to grow and adapt to changing markets, but traditional ECM often hinders their progress. Further, 82% of respondents reported an increase in unstructured data in the form of business content, like office documents, presentations, spreadsheets, and rich media. They are also managing transactional content from outside the organisation. Traditional ECM systems struggle to cope with this level of growth due to another key shortcoming – their inability to scale.


How Blockchain Technology Is 'Disrupting' The Art Economy As We Know It

Blockchain, the technology that supports Bitcoin and other cryptocurrencies, is now being used to decentralize other industries as well. Because the blockchain is a distributed ledger, secure and transparent, users are able to connect to each other without the centralized hub of a corporation. Simply put, management has been replaced by machines. In this new decentralized world, art has been one of the first and greatest use cases. Artists who otherwise would have been forced to use a large-scale centralized company to distribute their work are now able to distribute it in a decentralized way, and to receive rewards for their creations without profit-skimming corporate structures in place. And there are entities seeking to disrupt matters, although whether they can succeed in their endeavours is another matter.


How a data cache can solve your JavaScript performance problems

Service workers can be unpredictable. They can generate their own responses, and their response mechanism is not baked into the browser. "There are no caching semantics baked into service workers, unless the developer adds them in," Weiss said. If a service worker is not able to create a response, it uses the fetch API to look further up the stack. At the network layer, the application then checks the HTTP cache, which uses very strict caching semantics. HTTP cache is also persistent, which allows it to save resources to disk for later use. However, it is considerably slower than MemoryCache, which operates at RAM speeds. If data is not found in HTTP cache, the browser makes one last check for the Push Cache available as part of HTTP/2. But this is more complicated, since different browsers have different rules for managing Push Cache.
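In practice, the caching semantics a developer adds to a service worker often take the shape of a cache-first lookup that falls back to fetch(), which in turn consults the HTTP cache before going to the network. Below is a minimal TypeScript sketch of that pattern, not taken from the article; the cache name is hypothetical and a standard service worker context (the "webworker" type library) is assumed.

```typescript
// Cache-first service worker sketch (illustrative; not from the article).
// Assumes a service worker context with the standard Cache and Fetch APIs.
const CACHE_NAME = 'app-cache-v1'; // hypothetical cache name

const sw = self as unknown as ServiceWorkerGlobalScope;

sw.addEventListener('fetch', (event: FetchEvent) => {
  event.respondWith(
    caches.match(event.request).then((cached) => {
      // Serve from the service-worker-managed cache if a copy exists...
      if (cached) {
        return cached;
      }
      // ...otherwise fall back to fetch(), which consults the HTTP cache
      // (and, where applicable, the HTTP/2 push cache) before the network.
      return fetch(event.request).then((response) => {
        const copy = response.clone();
        caches.open(CACHE_NAME).then((cache) => cache.put(event.request, copy));
        return response;
      });
    })
  );
});
```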


Demystifying AI, Machine Learning and Deep Learning

Deep learning is the name for multilayered neural networks, which are networks composed of several “hidden layers” of nodes between the input and output. There are many variations of neural networks, which you can learn more about on this neural network cheat sheet. Improved algorithms, GPUs and massively parallel processing (MPP) have given rise to networks with thousands of layers. Each node takes input data and a weight and outputs a confidence score to the nodes in the next layer, until the output layer is reached, where the error of the score is calculated. With backpropagation, inside a process called gradient descent, the errors are sent back through the network and the weights are adjusted, improving the model. This process is repeated thousands of times, adjusting the model’s weights in response to the error it produces, until the error can’t be reduced any more.
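The loop the article describes (forward pass, error at the output, weights adjusted by gradient descent) can be shown in miniature. The following toy TypeScript sketch trains a single weight rather than a deep network; the data, learning rate, and iteration count are invented for illustration, and a real network would backpropagate the error through every hidden layer.

```typescript
// Toy gradient-descent loop for a single weight (illustrative only; a real deep
// network repeats this across many layers via backpropagation).
const inputs = [0.5, 1.5, 2.0, 3.0];       // made-up training inputs
const targets = inputs.map((x) => 2 * x);  // targets generated by a "true" weight of 2

let weight = Math.random();                // start from a random weight
const learningRate = 0.01;

for (let step = 0; step < 1000; step++) {
  for (let i = 0; i < inputs.length; i++) {
    const output = weight * inputs[i];     // forward pass: input times weight gives the score
    const error = output - targets[i];     // error measured at the output
    const gradient = error * inputs[i];    // derivative of (error^2)/2 with respect to the weight
    weight -= learningRate * gradient;     // adjust the weight to reduce the error
  }
}

console.log(weight); // converges toward 2 once the error can't be reduced much further
```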


Pentagon eyes bitcoin blockchain technology as cybersecurity shield

The key to blockchain’s security: any changes made to the database are immediately sent to all users to create a secure, established record. With copies of the data in all users’ hands, the overall database remains safe even if some users are hacked. This tamper-proof, decentralized feature has made blockchain increasingly popular beyond its original function of supporting bitcoin transactions. Many cutting-edge finance firms, for instance, have used blockchain to expedite processes and cut costs without compromising security. In Estonia, home of the video-calling pioneer Skype, officials have reported using blockchain to track national health records. In Russia, experiments are underway to integrate blockchain into the general payment economy.
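The tamper-evident property comes from chaining records by hash: each new record commits to the hash of the one before it, so changing an earlier record invalidates everything after it and is visible to anyone holding a copy. A minimal TypeScript (Node.js) sketch of that idea, illustrative only:

```typescript
// Hash-chained ledger sketch (illustrative only; a real blockchain adds
// consensus and replicates the chain to every participant).
import { createHash } from 'crypto';

interface Block {
  data: string;
  prevHash: string;
  hash: string;
}

function hashOf(data: string, prevHash: string): string {
  return createHash('sha256').update(prevHash + data).digest('hex');
}

function appendBlock(chain: Block[], data: string): void {
  const prevHash = chain.length > 0 ? chain[chain.length - 1].hash : '0';
  chain.push({ data, prevHash, hash: hashOf(data, prevHash) });
}

// Editing any earlier block breaks every hash that follows it.
function isTampered(chain: Block[]): boolean {
  return chain.some((block, i) => {
    const expectedPrev = i === 0 ? '0' : chain[i - 1].hash;
    return block.prevHash !== expectedPrev || block.hash !== hashOf(block.data, expectedPrev);
  });
}

const ledger: Block[] = [];
appendBlock(ledger, 'record A');
appendBlock(ledger, 'record B');
ledger[0].data = 'record A (edited)';  // tamper with an early record...
console.log(isTampered(ledger));       // ...and every copy holder can detect it: true
```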


Tech breakthroughs megatrend

Collectively, those driving factors are forcing big questions to the surface - questions that C-suite executives themselves are asking. To help provide answers, we tracked more than 150 discrete technologies and developed a methodology to identify the most pertinent of them. ... The specific technologies most impactful to a company can - and likely will - vary, of course, but when we analysed for the technologies with the most cross-industry and global impact over the coming years, eight emerged. They are at varying degrees of maturity; some have been around for years but are finally hitting their stride, while others are maturing rapidly. None will be surprising to CEOs; they are regular subjects of often breathless coverage in the popular press.


Hacker claims to have decrypted Apple's Secure Enclave

"Apple's job is to make [SEP] as secure as possible," xerub said. "It's a continuous process ... there's no actual point at which you can say 'right now it's 100% secure.'" Decrypting the SEP's firmware is huge for both security analysts and hackers. It could be possible, though xerub says it's very hard, to watch the SEP do its work and reverse engineer its process, gain access to passwords and fingerprint data, and go even further toward rendering any security relying on the SEP completely ineffective. "Decrypting the firmware itself does not equate to decrypting user data," xerub said. There's a lot of additional work that would need to go into exploiting decrypted firmware—in short it's probably not going to have a massive impact. An Apple spokesperson, who wished to remain unidentified, stated that the release of the SEP key doesn't directly compromise customer data.


Businesses need to talk about the cloud

Performance issues are a commonly cited bugbear following a cloud migration, with research finding that organisations experience a problem at least once every five days. If the application in question is business critical, this could be to the serious detriment of the organisation. From high network latency to application processing delays, poor cloud performance costs businesses both time and money, and greatly affects the end-user experience. But for many organisations, simply understanding where a performance issue occurs in the first place is a challenge. In the ‘old days’ of on-premises IT infrastructure, life was simpler. Organisations could, for example, quickly identify a misbehaving server in their data centre and initiate a fix. Today, the picture is not that straightforward, particularly with the increased uptake of public cloud services, because ‘your’ server is now in someone else’s data centre.
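Some of that visibility can be recovered from the client side. As a rough illustration, the browser's Resource Timing API can split a request into DNS, connection, waiting, and download phases; the TypeScript sketch below assumes a browser context, and cleanly separating server processing from network latency would still need something extra, such as a Server-Timing header from the provider.

```typescript
// Browser-side sketch: use the Resource Timing API to see roughly where the
// time for each request went (illustrative; isolating server-side processing
// cleanly would need something extra, e.g. a Server-Timing header).
const entries = performance.getEntriesByType('resource') as PerformanceResourceTiming[];

for (const e of entries) {
  console.log(e.name, {
    dnsMs: e.domainLookupEnd - e.domainLookupStart,
    connectMs: e.connectEnd - e.connectStart,
    waitingMs: e.responseStart - e.requestStart,  // network latency plus server processing
    downloadMs: e.responseEnd - e.responseStart,
  });
}
```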


All ‘things’ connected, the ‘I’ in the IoT – a closer look. Part three

Which technology or network type will prevail in the future is (very) hard to predict. In fact, there’s no real reason why they should be mutually exclusive; they don’t have to be. The fact that LTE networks have such broad global reach, and that they can also be used to provide NB-IoT and LTE-M networks with relative ease, could pose a threat to LPWAN networks, especially when companies like Verizon and AT&T are the ones pushing the technology. Though the same can be said for LoRa as well: companies such as IBM and Cisco are showing immense interest, as are CSPs like Swisscom and KPN. On the other hand, with the LTE/cellular companies focussing on the high-end market, so to speak, and the LPWAN providers focussing on the lower to mid-market range, mainly in the form of sensor-based data transport, there could be room for both.



Quote for the day:


"The desire of knowledge, like the thirst for riches, increases ever with the acquisition of it." -- Laurence Sterne

