Daily Tech Digest - August 02, 2017

Procurement: looking ahead to a digital future

Technology is elevating the role of procurement, providing services that are critical to maintaining corporate reputation, not to mention legal compliance. With procurement functions frequently stretching across the globe, companies need full visibility all the way to the end of a supply chain to guarantee it is free of issues such as modern-day slavery or bribery and corruption. “Automated procurement systems can help map connections in a supply chain; for a CEO, who is legally responsible for the elimination of poor processes, that puts procurement in a different light,” says Mr Coulcher. Using technology such as blockchain, companies can validate supply chains, bringing a level of transparency that has not always been possible. Last autumn, BHP Billiton, the mining company, announced plans to use the Ethereum blockchain to improve its supply chain processes.
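To make the transparency claim concrete, here is a minimal, self-contained sketch of the property a blockchain-style supply chain record relies on: each entry is hashed together with the previous entry, so altering any earlier record invalidates everything after it. The supplier and shipment details below are invented for illustration; this is not BHP Billiton's actual Ethereum-based system.

```python
import hashlib
import json

def record_hash(record: dict, prev_hash: str) -> str:
    """Hash a shipment record together with the previous link's hash."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def append(chain: list, record: dict) -> None:
    """Add a new link that commits to everything recorded before it."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"record": record, "prev_hash": prev_hash,
                  "hash": record_hash(record, prev_hash)})

def verify(chain: list) -> bool:
    """Any edit to an earlier record breaks every later hash."""
    prev_hash = "0" * 64
    for link in chain:
        if link["prev_hash"] != prev_hash:
            return False
        if link["hash"] != record_hash(link["record"], prev_hash):
            return False
        prev_hash = link["hash"]
    return True

chain = []
append(chain, {"supplier": "Smelter A", "lot": "1041", "cert": "audit-2017-06"})
append(chain, {"carrier": "Shipper B", "lot": "1041", "port": "Singapore"})
print(verify(chain))  # True; mutate chain[0]["record"] and it becomes False
```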


What Is IT Governance? A Formal Way To Align IT & Business Strategy

Organizations today are subject to many regulations governing the protection of confidential information, financial accountability, data retention and disaster recovery, among others. They're also under pressure from shareholders, stakeholders and customers. To ensure they meet internal and external requirements, many organizations implement a formal IT governance program that provides a framework of best practices and controls. Both public- and private-sector organizations need a way to ensure that their IT functions support business strategies and objectives. And a formal IT governance program should be on the radar of any organization in any industry that needs to comply with regulations related to financial and technological accountability.


Design Thinking 101

Each phase is meant to be iterative and cyclical rather than a strictly linear process. It is common to return to the two understanding phases, empathize and define, after an initial prototype is built and tested. This is because it is not until wireframes are prototyped and your ideas come to life that you get a true representation of your design. For the first time, you can accurately assess whether your solution really works. At this point, looping back to your user research is immensely helpful. What else do you need to know about the user in order to make decisions or to prioritize development order? What new use cases have arisen from the prototype that you didn’t previously research?


Why ex-employees may be your company's biggest cyberthreat

Ex-employees are increasingly a cybersecurity risk, Maxim noted: In June, Dutch web host Verelox experienced a major outage of all of its services after most of its servers were deleted by an ex-employee, according to the company. And in April, US-based Allegro MicroSystems sued an ex-IT administrator for allegedly installing malware that deleted critical financial data. So why don't companies take away this access immediately? For one, the process can be time-consuming: 70% of IT decision makers surveyed said it can take up to an hour to deprovision all of a single former employee's corporate application accounts. For another, IT and HR often do not work together, said Al Sargent, senior director at OneLogin. "This is a problem, because HR has the single source of truth for which employees are at the company and which are not, whereas IT controls access to the applications," he added.
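The fix Sargent points to is essentially a reconciliation job between HR's roster and the identity provider. The sketch below shows the shape of that job in Python; the data sources and the deprovision call are hypothetical placeholders, not OneLogin's actual API.

```python
from datetime import datetime, timezone

# Hypothetical data sources: in practice these would be the HR system's API
# (the source of truth for who is employed) and the identity provider's API
# (which controls access to applications).
def hr_active_employees() -> set:
    return {"alice@example.com", "bob@example.com"}

def idp_enabled_accounts() -> set:
    return {"alice@example.com", "bob@example.com", "mallory@example.com"}

def deprovision(email: str) -> None:
    # Placeholder: call the identity provider's suspend/deactivate endpoint here.
    print(f"{datetime.now(timezone.utc).isoformat()} suspending {email}")

def reconcile() -> None:
    """Suspend any account the IdP still allows but HR no longer lists."""
    orphans = idp_enabled_accounts() - hr_active_employees()
    for email in sorted(orphans):
        deprovision(email)

if __name__ == "__main__":
    reconcile()  # run on a schedule so departures are caught in minutes, not days
```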


State of Cybercrime 2017: Security events decline, but not the impact

Companies are spending more on IT security, with an average budget increase of 7.5 percent. Ten percent of respondents reported an increase of more than 20 percent. The bulk of that money is being spent on new technologies (40 percent), but companies are paying for knowledge, too, in the form of audits and assessments (34 percent), adding new skills (33 percent), and knowledge sharing (15 percent). Respondents said they were investing in redesigning their cybersecurity strategy (25 percent) and processes (17 percent) as well. Speaking of cybersecurity strategy, an amazing 35 percent of respondents said that a cyber response plan was not part of it. The good news is that 19 percent planned to put one in place within the next year.


Kubernetes as a service offers orchestration benefits for containers

Canonical makes Ubuntu, a leading free and open source Linux server and desktop OS distribution. The Canonical Kubernetes-as-a-service distribution is a packaged deployment that stitches together additional Canonical open source projects around Kubernetes, such as Juju, an application modeling framework that uses Charm scripts to simplify Kubernetes infrastructure builds, and Conjure-up, which orchestrates those Juju deployments. This distribution runs on various infrastructure environments, including local workstations, bare-metal servers, AWS, Google Compute Engine, Azure, Joyent and OpenStack. Canonical partnered with Google, Kubernetes' original developer, to maintain its distribution, with the aim of simplifying and standardizing Kubernetes clusters on just about any conceivable environment.
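For a sense of what the packaged deployment looks like in practice, the sketch below drives conjure-up from Python. The snap install command, the "canonical-kubernetes" spell name, and the "localhost" (LXD) target reflect Canonical's 2017-era documentation and may differ in current releases; conjure-up's interactive prompts are omitted here.

```python
import subprocess

def run(cmd: list) -> None:
    """Echo and execute a command, failing loudly if it errors."""
    print("$", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Install conjure-up and deploy the Canonical Kubernetes spell on local LXD.
run(["sudo", "snap", "install", "conjure-up", "--classic"])
run(["conjure-up", "canonical-kubernetes", "localhost"])

# Once conjure-up finishes it writes a kubeconfig; a quick sanity check:
run(["kubectl", "get", "nodes"])
run(["juju", "status"])  # Juju shows the charm-managed units backing the cluster
```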


Black Hat 2017: Insightful, but too much hype

The industry has become far too obsessed with the zero-day problem (i.e. zero-day exploits) and isn't paying enough attention to eliminating all the manual tasks and busy work we do as cybersecurity professionals. Oh, I agree that zero-days are a problem, but these attacks are the exception. We need to get better at bread-and-butter cybersecurity operations with improved processes, automation and orchestration. In other words, people REMAIN the weakest link of the cybersecurity chain. Addressing this problem should be a high priority for all CISOs. ... New security analytics tools are expanding and challenging SIEM platforms. Software-defined tools are pushing out tried-and-true network security controls.


Machine Learning Comes To Your Browser Via JavaScript

The most prominent advantages of TensorFire’s approach are its portability and convenience. Modern web browsers run on almost every operating system and hardware platform, and even low-end smartphones have generous amounts of GPU power to spare. Much of the work involved in getting useful results from machine learning models is setting up the machine learning pipeline, either to perform the training or to deliver the predictions. It is very useful to boil much of that process down to just opening up a web browser and clicking something, at least for certain classes of jobs. Another advantage claimed by TensorFire’s creators is that it allows predictions to be deployed entirely on the client. This won’t be as much of an advantage where both the trained model and the data are already deployed to the cloud.


Bill Would Beef Up Security for IoT Wares Sold to US Gov't

"This bill deftly uses the power of the federal procurement market, rather than direct regulation, to encourage Internet-aware device makers to employ some basic security measures in their products," Jonathan Zittrain, co-founder of Harvard University's Berkman Klein Center for Internet and Society, said in a statement announcing the bill's introduction. Security technologist and author Bruce Schneier, also in a statement, sees the legislation as a way to motivate vendors to make the investments needed to secure their IoT offerings. "The market is not going to provide security on its own, because there is no incentive for buyers or sellers to act in anything but their self-interests," says Schneier, who also is a Harvard Kennedy School of Government fellow and lecturer.


How to use Redis for real-time stream processing

Redis has become a popular choice for such fast data ingest scenarios. A lightweight in-memory database platform, Redis achieves throughput in the millions of operations per second with sub-millisecond latencies, while drawing on minimal resources. Its built-in data structures and commands also keep implementations simple. In this article, I will show how Redis Enterprise can solve common challenges associated with the ingestion and processing of large volumes of high velocity data. We’ll walk through three different approaches (including code) to processing a Twitter feed in real time, using Redis Pub/Sub, Redis Lists, and Redis Sorted Sets, respectively. As we’ll see, all three methods have a role to play in fast data ingestion, depending on the use case.
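As a rough sketch of the first approach (Pub/Sub), assuming the redis-py client (3.x argument order) and a placeholder producer in place of the real Twitter API, the pattern looks like this; the Lists and Sorted Sets variants differ mainly in the commands noted in the comments.

```python
import json
import redis  # assumes the redis-py client (pip install redis)

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def publish_tweet(tweet: dict) -> None:
    """Producer: fan a tweet out to any number of live subscribers."""
    r.publish("tweets", json.dumps(tweet))

def consume_tweets() -> None:
    """Consumer: receive tweets as they arrive (Pub/Sub does not persist messages)."""
    pubsub = r.pubsub()
    pubsub.subscribe("tweets")
    for message in pubsub.listen():
        if message["type"] != "message":
            continue
        tweet = json.loads(message["data"])
        # The Lists approach would buffer instead: r.lpush("tweets:queue", message["data"])
        # The Sorted Sets approach would rank:     r.zincrby("hashtags", 1, tweet.get("tag", ""))
        print(tweet)

publish_tweet({"user": "example", "text": "hello fast data", "tag": "#redis"})
```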



Quote for the day:


"The most rewarding things you do in life are often the ones that look like they cannot be done." -- Arnold Palmer

