Daily Tech Digest - June 26, 2018

Is It Time To Say Goodbye To Email?


Nearly all organizations that try to get rid of email start internally by switching to a cloud-based collaborative system that allows employees to chat, correspond and work together virtually. Some companies have even resorted to an automatic response to internal email reminding the sender that the message won’t be answered and that they need to use the collaboration software instead. ... The happy medium between getting rid of email completely and keeping it unchanged is to modify it in some way. If the future of work relies on new technology and collaboration, it makes sense to imagine the next generation of email serving a purpose similar to pagers in the 1990s. If someone posts on the office’s collaborative system, sends a calendar invite, or tags you in a post, you could get an alert in your email that directs you to the correct system for the information. In this forward-thinking view, email’s purpose is to notify and direct rather than to provide all the information. This approach would seem to work better internally, but could also have success across organizations.



North American, UK, Asian regulators press EU on data privacy exemption

It also narrows an exemption for cross-border personal data transfers made in the “public interest” by imposing new conditions, including extra privacy safeguards, on its use, said the officials and legal experts. Under the previous law, regulators used the exemption to share vital information, such as bank and trading account data, to advance probes into a range of misconduct. For now, regulators are operating on the basis that they can continue sharing such data under the new exemption, but say doing so takes them into legally ambiguous territory because the new law’s language leaves room for interpretation. They fear that without explicit guidance, investigations, such as current U.S. probes into cryptocurrency fraud and market manipulation in which many of the actors are based overseas, could be at risk. This is because, in the absence of an exemption, cross-border information sharing could be challenged on the grounds that some countries’ privacy safeguards fall short of those now offered by the EU.


5 reasons the IoT needs its own networks

Despite having to be built from scratch, these new IoT networks can offer much less expensive service. T-Mobile, for example, offers a $6-a-year rate plan for machines on its new NB-IoT network. The company claims that’s 1/10 the cost of a similar Verizon service plan, but even $60 a year is far less expensive than a standard cellular connection. Just as important, the low-power devices that use these networks are much less expensive than standard LTE devices like mobile phones. As AT&T put it in a press release last year, "We can now reach new places and connect new things at a price that's more affordable than ever before." ... Efficient use of scarce, expensive radio-frequency spectrum is the third reason dedicated IoT networks make sense. Both NB-IoT and LTE-M can be deployed in a very small slice of spectrum compared to 4G deployments. NB-IoT can even be deployed in so-called LTE spectrum "guard bands" that sit between LTE channels to prevent interference. That means NB-IoT communications do not share spectrum resources with standard smartphone traffic, for example.
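
The $60-a-year figure above is the Verizon price implied by the "1/10 the cost" claim rather than a quoted rate. A quick back-of-the-envelope sketch in Python of that arithmetic:

# Back-of-the-envelope check of the pricing claim above.
tmobile_nb_iot_per_year = 6.00                             # T-Mobile's $6-a-year NB-IoT plan
verizon_implied_per_year = tmobile_nb_iot_per_year * 10    # from the "1/10 the cost" claim

print(f"Implied Verizon price: ${verizon_implied_per_year:.2f}/year")  # $60.00/year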


Tales from the Crypt(ography) Lab with Dr. Kristin Lauter

So that might sound like it’s in the stone age when we think of how fast technology evolves these days. But typically, for public key crypto systems over the last 40, 50 years, there has been at least a 10-year time lag before crypto technologies get adopted. And that’s because the community needs to have time to think about how hard these problems are, and to set the security levels appropriately, and to standardize the technologies. So, we’re just getting to that point now where, kind of, almost 10 years after the first solutions were introduced, we’ve put together a broad coalition of researchers in industry, government and academia to come together to try to standardize this technology. And we’re having our second workshop in March at MIT, where we’re going to try to get broad approval for our standard document, which recommends security parameters. So that’s the first challenge: getting everyone to agree on what the strength of these systems is, kind of, essentially, how hard are these mathematical problems underneath. And then we plan to continue to build on that with this community, to get agreement on a common set of APIs.


Ethical Data Science Is Good Data Science


When you work with third parties, where your data is “better together,” should you share it all? No. This means enforcing fine-grained controls on your data: not just coarse-grained role-based access control (RBAC), but controls down to the column and row level of your data, based on user attributes and purpose (more on that below). You need to employ techniques such as column masking, row redaction, limiting access to an appropriate percentage of the data, and, even better, differential privacy to ensure data anonymization. In almost all cases, your data scientists will thank you for it. It provides accelerated, compliant access to data, along with the comfort, freedom, and collaboration that come when everyone knows they are operating compliantly and can share work more freely. This freedom to access and share data comes when data controls are enforced at the data layer consistently and dynamically across all users. It provides the strong foundation needed to enable a high-performing data science team.
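
As a rough illustration of what attribute- and purpose-based controls like these can look like, here is a minimal Python sketch. The policy table, the purposes, and the hashing-based masking are all hypothetical, and a real deployment would enforce the rules at the data layer (in the database or query engine) rather than in application code.

import hashlib
import pandas as pd

# Hypothetical policy: which columns a given purpose may see in the clear.
CLEAR_COLUMNS = {
    "fraud_investigation": {"account_id", "amount", "country"},
    "marketing_analytics": {"amount", "country"},
}

def mask(value) -> str:
    """Replace a sensitive value with a stable, non-reversible token."""
    return hashlib.sha256(str(value).encode()).hexdigest()[:12]

def apply_policy(df: pd.DataFrame, purpose: str, user_region: str) -> pd.DataFrame:
    """Row redaction plus column masking based on user attributes and purpose."""
    allowed = CLEAR_COLUMNS.get(purpose, set())
    # Row-level redaction: only rows matching the user's region attribute.
    view = df[df["country"] == user_region].copy()
    # Column-level masking: hash anything this purpose is not cleared to see.
    for col in view.columns:
        if col not in allowed:
            view[col] = view[col].map(mask)
    return view

if __name__ == "__main__":
    data = pd.DataFrame({
        "account_id": ["A-100", "A-101", "A-102"],
        "amount": [250.0, 999.5, 42.0],
        "country": ["BR", "BR", "MX"],
    })
    # Analyst with the marketing purpose and a BR region attribute sees
    # BR rows only, with account_id masked.
    print(apply_policy(data, "marketing_analytics", "BR"))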


Function Platforms with Chad Arimura and Matt Stephenson

“Serverless” is a word used to describe functions that get deployed and run without the developer having to manage the infrastructure explicitly. Instead of creating a server, installing the dependencies, and executing your code, the developer just provides the code to the serverless API, and the serverless system takes care of the server creation, the installation, and the execution. Serverless was first offered with the AWS Lambda service, and other cloud providers now offer it as well. There have also been numerous open source serverless systems. On SE Daily, we have done episodes about OpenWhisk, Fission, and Kubeless. All of these can run on the Kubernetes container management system. Kubernetes is an open-source system for deploying and managing containerized infrastructure, so it is a useful building block for higher-level platforms.
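
For a concrete sense of what "just provides the code" means, here is a minimal Python sketch of a handler written in the AWS Lambda style. The event shape assumes an HTTP trigger through an API Gateway proxy integration, and the function and field names are illustrative rather than taken from the episode.

import json

# A minimal function-as-a-service handler: the platform creates the runtime,
# invokes this entry point once per event, and tears the resources down again.
# The developer never provisions or manages a server.
def handler(event, context):
    # 'event' carries the trigger payload; here it is assumed to be an HTTP
    # request routed through an API Gateway proxy integration.
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }

if __name__ == "__main__":
    # Local smoke test; in production the platform supplies event and context.
    print(handler({"queryStringParameters": {"name": "SE Daily"}}, None))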


Serverless cloud computing is the next big thing

Serverless computing in the cloud is a good idea; serverless computing is not just for the datacenter. Serverless cloud computing means the ability to get out of the business of provisioning cloud-based resources, such as storage and compute, to support your workloads, and instead rely on automation at the cloud provider to allocate and deallocate resources automatically. ... We’re witnessing a reengineering of public cloud services to use a serverless approach. First, we’re seeing it in resource-intensive services such as compute, storage, and databases, but you can count on the higher-end cloud services being added to the list over time, including machine learning and analytics. What this all means for the enterprise is that less work will be needed to figure out how to size workloads. This serverless trend should also provide better utilization and efficiency, which should lower costs over time. Still, be careful: I’ve seen the use of serverless computing lead to higher costs in some instances, so be sure to monitor closely. There is clearly a need for serverless cloud computing.


Latin American banks advance in digital transformation projects

In terms of consumer technology trends, mobile banking in the region has surpassed both online banking and traditional channels and has become the number one channel for banks today, the report says. Regional Internet banking client uptake is at 67 percent, compared to 79 percent in 2015, while mobile applications rose to 33 percent. Millennials appear to be the most important target segment for digital banking, followed by premium clients and "native" digital customers, according to the study. When it comes to enterprise technology trends, the report notes that over 60 percent of Latin American banks are implementing or testing cloud computing, chatbots and Big Data, while a minority (less than 22 percent) mentions Blockchain, Internet of Things and virtual reality. Some 13 percent of the banks surveyed said they have plans to invest in a new core banking platform in the next year, while 7 percent are updating their core system. While 70 percent of those polled consider other banks with better digital capabilities the main threat, nine out of 10 banks surveyed see fintechs as potential partners or acquisition targets.


Why Intel won't patch TLBleed vulnerability, despite serious concerns for cloud users

"Maybe Intel has solutions with less overhead. But Intel excluded us from conversation so we don't know what those solutions might be. So we follow a pattern of immediately releasing a rough solution, which we can retract if a cheaper solution becomes published." Intel's position on this is somewhat peculiar: the company has indicated that existing mitigations are sufficient to prevent this issue, and it has declined to request a CVE to identify the flaw, as is standard practice. The Register report also indicates that Intel declined to pay a bug bounty for the discovery via HackerOne, even though side-channel attacks fall within the scope of the requirements Intel lists for its bounty program, a decision Gras described to The Register as "goalpost-moving." Exploitation of, and patches for, TLBleed are likely to be more technically involved than the OpenBSD strategy of disabling SMT entirely, as ensuring that schedulers do not place processes of different security levels on the same core is a significant undertaking.


The C4 Model for Software Architecture


Ambiguous software architecture diagrams lead to misunderstanding, which can slow a good team down. In our industry, we really should be striving to create better software architecture diagrams. After years of building software myself and working with teams around the world, I've created something I call the "C4 model". C4 stands for context, containers, components, and code — a set of hierarchical diagrams that you can use to describe your software architecture at different zoom levels, each useful for different audiences. Think of it as Google Maps for your code. ... Level 2, a container diagram, zooms into the software system, and shows the containers (applications, data stores, microservices, etc.) that make up that software system. Technology decisions are also a key part of this diagram. Below is a sample container diagram for the Internet banking system. It shows that the Internet banking system (the dashed box) is made up of five containers: a server-side web application, a client-side single-page application, a mobile app, a server-side API application, and a database.
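
As a rough sketch (not an official C4 notation), the container-level view described above can be captured as plain data. The Python below lists the five containers named in the example; the technology labels and the dataclass structure are illustrative assumptions rather than details from the article.

from dataclasses import dataclass, field

@dataclass
class Container:
    name: str
    technology: str  # C4 container diagrams record technology choices
    description: str = ""

@dataclass
class SoftwareSystem:
    name: str
    containers: list = field(default_factory=list)

# The Internet banking system from the example, decomposed into its containers.
internet_banking = SoftwareSystem(
    name="Internet Banking System",
    containers=[
        Container("Web Application", "server-side web app"),
        Container("Single-Page Application", "client-side SPA"),
        Container("Mobile App", "mobile app (assumed iOS/Android)"),
        Container("API Application", "server-side API"),
        Container("Database", "relational database (assumed)"),
    ],
)

# A container diagram is this system boundary plus the relationships between
# these containers and the people and systems that use them.
for c in internet_banking.containers:
    print(f"[{internet_banking.name}] {c.name} ({c.technology})")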



Quote for the day:


"When you practice leadership,The evidence of quality of your leadership, is known from the type of leaders that emerge out of your leadership" -- Sujit Lalwani

