Daily Tech Digest - April 24, 2019

Edge computing is in most industries’ future

The growth of edge computing is about to take a huge leap. Right now, companies generate about 10% of their data outside a traditional data center or cloud, but within the next six years that share will increase to 75%, according to Gartner. That's largely down to the need to process data emanating from devices such as Internet of Things (IoT) sensors. Early adopters include manufacturers and retailers. Manufacturers: devices and sensors are endemic to this industry, so it's no surprise that it needs faster ways to process the data they produce. A recent Automation World survey found that 43% of manufacturers have deployed edge projects, with production/manufacturing data analysis and equipment data analytics among the most popular use cases. Retailers: like most industries deeply affected by the need to digitize operations, retailers are being forced to innovate their customer experiences. To that end, these organizations are "investing aggressively in compute power located closer to the buyer," writes Dave Johnson, executive vice president of the IT division at Schneider Electric.


Why fintech is the sharing economy’s final frontier

Fintech through sharing economy applications threatens to break up the banking complex as we know it – all through social capital and economic sharing. Evidently, this is where investors feel the sharing economy has the most potential: fintech is already attracting huge amounts of investment and is the biggest sector in terms of venture capital funding. Peer-to-peer lending, equity crowdfunding, and payments are proving to be the three areas where the sharing economy holds the biggest potential for fintech. Firstly, equity crowdfunding creates a two-sided marketplace between investors and startups: a slice of the private company is sold for capital, typically through the sale of securities such as shares, convertible notes, debt, or a revenue share. The process is similar to crowdfunding or Kickstarter campaigns, but with possible payouts for those willing to put their money where their mouth is.
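To make the "slice of the company sold for capital" idea concrete, here is a minimal back-of-the-envelope sketch; the valuation and raise figures are invented for illustration and are not from the article.

```python
# Hypothetical numbers to illustrate an equity crowdfunding raise;
# they are illustrative only and not drawn from the article.
pre_money_valuation = 4_500_000   # agreed value of the startup before the raise
amount_raised = 500_000           # capital contributed by the crowd of investors

post_money_valuation = pre_money_valuation + amount_raised
investor_ownership = amount_raised / post_money_valuation

print(f"Crowd investors collectively hold {investor_ownership:.1%} of the company")
# -> 10.0%: the equity "slice" sold in exchange for the capital, e.g. via ordinary shares
```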


Intel's 9th-gen mobile Core chips
For now, Intel’s 9th-gen mobile chips are targeting the sort of dazzling, high-end gaming notebooks most of us unfortunately can’t afford or can’t bear to lug around. If that’s the case, be patient: many of the more mainstream mobile 9th-gen Core chips will likely debut in late summer or fall. (Separately, Intel announced a metric ton of new 9th-gen desktop processors, about six months after it announced the Core i9-9900K.) Until then, read on to get an idea of what to expect. ... The new 9th-gen Core chips are designed with Intel’s “300-series” mobile chipsets in mind: the Intel CM246, QM370, or HM370. A key performance advantage is the 16 lanes (x16) of PCI Express 3.0 routed directly off the CPU, which provide enough bandwidth for an upcoming generation of third-party discrete GPUs. Support for a full 128GB of DDR4 memory isn’t anything to sneeze at, either. The new 9th-gen mobile Core chips also ship with integrated graphics, part of what Intel calls “Generation 9.5.”
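As a quick sanity check on why those 16 lanes matter, PCIe 3.0's published rate of 8 GT/s per lane with 128b/130b encoding works out to roughly 1 GB/s per lane per direction; the short calculation below is a back-of-the-envelope sketch, not a figure from Intel's announcement.

```python
# Back-of-the-envelope PCIe 3.0 x16 bandwidth estimate (not from the article).
transfer_rate_gt_s = 8.0          # gigatransfers per second, per lane
encoding_efficiency = 128 / 130   # 128b/130b line-coding overhead
lanes = 16                        # x16 link off the CPU

# Effective payload bandwidth per lane, in GB/s (each transfer carries 1 bit).
per_lane_gb_s = transfer_rate_gt_s * encoding_efficiency / 8
total_gb_s = per_lane_gb_s * lanes

print(f"~{per_lane_gb_s:.3f} GB/s per lane, ~{total_gb_s:.1f} GB/s for x16 (per direction)")
# -> roughly 0.985 GB/s per lane and ~15.8 GB/s each way for a full x16 link,
#    which is why routing all 16 lanes from the CPU matters for discrete GPUs.
```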


Boosting data strategies by combining cloud and infrastructure management

By providing insight into IT operations across hardware, software, and network environments, these tools support daily operations through real-time monitoring, reduce downtime, and maintain business productivity. The growing integration of next-generation technologies such as machine learning and artificial intelligence is positioning infrastructure management as an attractive choice for IT teams. One major benefit infrastructure management gives data center managers is the ability to monitor all aspects of IT operations. By serving as an intuitive platform that lets teams tap into the full value of existing infrastructure, these tools fuel modernization with intelligent software that provides complete visibility into, and control over, the environment. For example, companies using infrastructure management tools see up to a 40 percent increase in operating efficiencies. From these insights, operators can take control of power consumption, real-time data center health, and preventative analytics.
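As an illustration of the kind of real-time health monitoring described here, the sketch below flags racks whose power draw or inlet temperature exceeds a threshold; the metrics, limits, and rack data are hypothetical and not tied to any particular infrastructure management product.

```python
# Hypothetical real-time health check; all readings and thresholds are invented.
POWER_LIMIT_KW = 12.0
TEMP_LIMIT_C = 32.0

racks = [
    {"id": "rack-01", "power_kw": 9.8, "inlet_temp_c": 24.5},
    {"id": "rack-02", "power_kw": 13.1, "inlet_temp_c": 27.0},
    {"id": "rack-03", "power_kw": 11.2, "inlet_temp_c": 33.4},
]

def health_alerts(rack):
    """Return alert strings for one rack's latest readings."""
    alerts = []
    if rack["power_kw"] > POWER_LIMIT_KW:
        alerts.append(f"{rack['id']}: power {rack['power_kw']} kW over {POWER_LIMIT_KW} kW limit")
    if rack["inlet_temp_c"] > TEMP_LIMIT_C:
        alerts.append(f"{rack['id']}: inlet temp {rack['inlet_temp_c']} C over {TEMP_LIMIT_C} C limit")
    return alerts

for rack in racks:
    for alert in health_alerts(rack):
        print(alert)
```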


The FBI's RAT: Blocking Fraudulent Wire Transfers

As much as it might seem like fighting internet crime is like pushing the tide with a broom, there is a bright spot in the gloom. In February 2018, the IC3 created what it terms the RAT, or Recovery Asset Team. Its goal is to contact financial institutions quickly to freeze suspicious pending wire transfers before they're final. Much internet-enabled crime eventually intersects with banking systems. So while it may be difficult to prevent scams, there is a touch point where, with industrywide cooperation, stolen funds can be recovered. But time is tight, and swiftly contacting financial institutions is key to stopping stolen funds from being withdrawn. IC3 reports that the bureau's RAT group - working with what's termed the Domestic Financial Fraud Kill Chain - handled 1,061 incidents between its launch and the end of last year, covering an 11-month period. Those incidents caused losses of more than $257 million. Of that, the RAT achieved a laudable 75 percent recovery rate, or more than $192 million.


GraphQL: Core Features, Architecture, Pros, and Cons


A GraphQL server provides a client with a predefined schema — a model of the data that can be requested from the server. In other words, the schema serves as a middle ground between the client and the server, defining how to access the data. Written in Schema Definition Language (SDL), the basic components of a GraphQL schema — types — describe the kinds of objects that can be queried on that server and the fields they have. The schema defines what queries are allowed, what types of data can be fetched, and the relationships between these types. You can create a GraphQL schema and build an interface around it in any programming language. Because the client has the schema before querying, it can validate a query against it to make sure the server will be able to respond. And since the shape of a GraphQL query closely matches the shape of its result, you can predict what will be returned. This eliminates unwelcome surprises such as unavailable data or a wrong structure.
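A minimal sketch of that schema-first flow, using the graphql-core Python package; the SDL types, resolver, and data below are invented for illustration rather than taken from the article.

```python
# Minimal schema-first sketch using graphql-core (pip install graphql-core).
from graphql import build_schema, graphql_sync

# Types written in Schema Definition Language (SDL) describe what can be queried.
schema = build_schema("""
    type Author {
        name: String
        articles: [String]
    }

    type Query {
        author(name: String): Author
    }
""")

# Hypothetical in-memory resolver standing in for a real data source.
# graphql-core's default resolver calls it as resolver(info, **arguments).
def resolve_author(info, name):
    return {"name": name, "articles": ["GraphQL: Core Features", "Schemas 101"]}

# The query's shape closely matches the shape of the result it returns.
query = '{ author(name: "Jane") { name articles } }'
result = graphql_sync(schema, query, root_value={"author": resolve_author})

print(result.data)    # {'author': {'name': 'Jane', 'articles': [...]}}
print(result.errors)  # None when the query validates against the schema
```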


Why composable infrastructure goes hand in hand with private cloud


"With composable infrastructure, I can have a physical server dynamically provisioned in the size and shape I need and have it all stitched together. I'm actually getting those physical server assets to use," said Mike Matchett, storage analyst and founder of Small World Big Data, an IT consultancy near Boston. The composable model differs from converged and hyper-converged systems. Converged infrastructure is sold as racks of prequalified hardware from multiple vendors and includes systems such as NetApp FlexPod and Dell EMC VxBlock. Hyper-converged systems ship all the necessary components -- servers, hypervisor software, network connectivity and storage -- delivered as an integrated appliance. While swapping in composable modules sounds appealing, Matchett said enterprises should methodically estimate the financial cost before taking the plunge. "Without a significant churn rate for the resources, I'm not sure composable makes much fiscal sense," he said. The cost made sense for Clearsense, based in Jacksonville, Fla.


Dark Side of Offshore Software Development

Time is money, especially if you are working against the clock to get your product on the market before the competition, as most startups do. A week’s delay may not kill your business, but few projects can afford to lose a month or more. However, offshore development is not the reason behind missed deadlines: in-house teams can fail to finish projects on time too, and the expenses will be even higher. ... One of the problems of outsourcing is that even if the project progresses on schedule, it is hostage to the offshore vendor. Agile development may provide you with interim results and functionality, but you won’t receive the full package until the vendor delivers it. If you are in a hurry to get the product to market, you may be ready to give the shirt off your back, and unscrupulous companies are willing to risk their reputation and future business to squeeze you dry.


What Is Explainable Artificial Intelligence and Is It Needed?

Explainable AI aims to lay out the reasoning of new machine/deep learning systems, to determine their strengths and weaknesses, and to convey how they will behave in the future. The strategy for achieving this goal is to develop new or modified machine learning techniques that produce more explainable models. These models are intended to be combined with state-of-the-art human-computer interface techniques that can turn models into understandable and useful explanation dialogs for the end user. ... “XAI is one of a handful of current DARPA programs expected to enable ‘third-wave AI systems,’ where machines understand the context and environment in which they operate, and over time build underlying explanatory models that allow them to characterize real-world phenomena.” Taking medical practice as an example: after examining the patient’s data, the physician should both understand why the decision support system flagged a risk of heart attack and be able to explain that recommendation to the concerned patient.
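As a toy illustration of what such an explanation could look like in the heart-attack scenario, the sketch below uses an intrinsically interpretable model (logistic regression) whose prediction decomposes into per-feature contributions; all features, weights, and patient values are hypothetical and do not come from any real decision support system.

```python
# Toy "explainable" risk score; features, weights, and patient data are invented.
import math

# Hypothetical logistic regression whose weights are directly readable.
weights = {"age": 0.04, "systolic_bp": 0.02, "cholesterol": 0.01, "smoker": 0.8}
bias = -7.5

patient = {"age": 61, "systolic_bp": 150, "cholesterol": 230, "smoker": 1}

# Each feature contributes weight * value to the log-odds, so the prediction
# decomposes into per-feature explanations the physician can relay to the patient.
contributions = {f: weights[f] * patient[f] for f in weights}
log_odds = bias + sum(contributions.values())
risk = 1 / (1 + math.exp(-log_odds))

print(f"Predicted heart-attack risk: {risk:.1%}")
for feature, contrib in sorted(contributions.items(), key=lambda kv: -kv[1]):
    print(f"  {feature:>12}: {contrib:+.2f} to the log-odds")
```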


How AI could save the environment

Google has also used its own AI expertise to improve its energy efficiency as a company—leveraging DeepMind's machine learning capabilities, it reduced the amount of energy needed to cool its data centers by 40%. "AI is most helpful when the possible solution to a problem resides in large, highly dimensional datasets," Pucell said. "If you think about climate data, there's a wealth of traditional structured data about temperature, sea levels, emissions levels, etc. But there's also a lot of unstructured data in the form of images, video, audio, and text. When it comes to analyzing massive amounts of unstructured data, deep learning is really the only game in town." At USC, Dilkina's research group has used AI to develop optimization methods for wildlife conservation planning—an area where highly limited budgets need to be allocated to protect the most ecologically effective land, she said. Her team has also used machine learning and game theory to help protected areas fight the poaching of endangered animals, including elephants and rhinos.
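To give a flavor of the budget-allocation problem Dilkina's group works on, here is a deliberately simplified sketch that picks land parcels to protect under a fixed budget using a greedy value-per-dollar heuristic; the parcels, costs, and scores are invented, and real conservation planning relies on much richer optimization models than this.

```python
# Hypothetical budget-constrained conservation planning sketch; all data invented.
parcels = [
    # (name, cost in $M, conservation value score)
    ("wetland_a", 4.0, 9.0),
    ("forest_b", 6.0, 11.0),
    ("corridor_c", 2.5, 6.5),
    ("grassland_d", 3.0, 4.0),
]
budget = 9.0

# Greedy heuristic: protect parcels with the best value-per-dollar first.
chosen, spent = [], 0.0
for name, cost, value in sorted(parcels, key=lambda p: p[2] / p[1], reverse=True):
    if spent + cost <= budget:
        chosen.append(name)
        spent += cost

print(f"Protect {chosen} for ${spent:.1f}M of the ${budget:.1f}M budget")
```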



Quote for the day:


"Nobody in your organization will be able to sustain a level of motivation higher than you have as their leader." -- Danny Cox

