Daily Tech Digest - January 06, 2020

Deep learning vs. machine learning: Understand the differences

Dimensionality reduction is an unsupervised learning problem that asks the model to drop or combine variables that have little or no effect on the result. It is often used in combination with classification or regression. Dimensionality reduction algorithms include removing variables with many missing values, removing variables with low variance, decision trees, random forests, removing or combining variables with high correlation, backward feature elimination, forward feature selection, factor analysis, and principal component analysis (PCA).

Training and evaluation turn supervised learning algorithms into models by optimizing their parameter weights to find the set of values that best matches the ground truth of your data. The algorithms often rely on variants of steepest descent for their optimizers, for example stochastic gradient descent, which is essentially steepest descent performed multiple times from randomized starting points.
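
To make both halves of that concrete, here is a minimal sketch assuming scikit-learn (the article names no library): PCA reduces the number of input variables, then a linear classifier is trained with stochastic gradient descent and evaluated against held-out ground truth.

# A minimal sketch, assuming scikit-learn: reduce 64 pixel variables to a
# few principal components, then train and evaluate an SGD-based classifier.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)  # 64 input variables per sample
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Dimensionality reduction: combine correlated variables into 16 components.
pca = PCA(n_components=16).fit(X_train)
X_train_r, X_test_r = pca.transform(X_train), pca.transform(X_test)

# Training: optimize parameter weights with stochastic gradient descent.
clf = SGDClassifier(random_state=0).fit(X_train_r, y_train)

# Evaluation: compare predictions with held-out ground truth.
print(f"variance kept:  {pca.explained_variance_ratio_.sum():.2f}")
print(f"test accuracy: {clf.score(X_test_r, y_test):.2f}")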



Why enterprises should care about DevOps


The old days of manually doing everything as an IT person are gone, and companies that are still operating that way are undergoing transformation. But I don’t think we’re ever going to get rid of operational concerns. It’s just going to be that rather than doing things manually, or through graphical consoles, you're going to work via APIs, scripting languages and automation tools like Puppet. And in many ways – and I say this quite a lot – DevOps has made operations people feel that they must become developers to get their job done. But it’s more about embracing software engineering principles. It’s about version control, release management, branching strategies, and continuous integration and delivery. We’ve seen this repeatedly, and that’s why we added features to Puppet Enterprise around continuous delivery, because the most successful customers were those that were adopting infrastructure as code.
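
To see what "embracing software engineering principles" looks like in practice, here is a hypothetical sketch of the declarative, idempotent pattern that tools like Puppet implement. This is not Puppet's own code, and the package and file names are illustrative only: you declare a desired state in version control, and a script converges the machine toward it.

import pathlib
import subprocess

# Desired state lives in version control; the script converges the machine
# toward it instead of someone mutating the machine by hand.
DESIRED_STATE = {
    "packages": ["nginx"],
    "files": {"/etc/motd": "Managed as code, not by hand\n"},
}

def converge(state):
    for pkg in state["packages"]:
        # Install only if missing: running this twice changes nothing.
        present = subprocess.run(["dpkg", "-s", pkg],
                                 capture_output=True).returncode == 0
        if not present:
            subprocess.run(["apt-get", "install", "-y", pkg], check=True)
    for path, content in state["files"].items():
        p = pathlib.Path(path)
        if not p.exists() or p.read_text() != content:
            p.write_text(content)

if __name__ == "__main__":
    converge(DESIRED_STATE)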


Legal engineering: A growing trend in software development

Legal engineers come from incredibly diverse backgrounds and collectively have years of experience and insights that benefit our customers tremendously. They include former attorneys from top law schools and some of the country's best law firms, experts in contract law, and a former civil rights trial attorney. We have other legal engineers who came to us from top-tier management consulting firms and several who gained considerable experience at some of Silicon Valley's best SaaS companies. These diverse backgrounds and responsibilities mean that the role of legal engineering can seem very different depending on who you ask. To our customers, they are thought partners, advising on best practices for building a modern legal team. To our product team, they are the voice of the user, listening and synthesizing valuable feedback. Sometimes, we even refer to them internally as our in-house S.W.A.T. team, because they are ready and able to jump in and help fix any situation. Ultimately, legal engineers are at the forefront of the modernization of in-house legal. As legal technology continues to evolve, so will legal engineering.


Fragmentation by Country
In this post, we look at how fragmentation varies across the globe and key statistics to keep in mind if you have a presence in these markets. The growth mantra of online businesses is scale: reach more users, fast. However, as you scale across countries, it's important to ensure that your app or website is compatible with your users' devices and browsers. Compatibility is to online businesses what distribution is to brick-and-mortar ones. You might have the best product in the world, but it counts for nothing if your customers don't get the experience you designed for them. For instance, being compatible with the top 20 devices will help you cover 70% of the US audience. In India, not only will the devices be different, but the coverage provided will be less than 35%. Similarly, if your mobile website doesn't load properly in the Opera browser, you would be ignoring almost half of the Nigerian market!
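
A short script makes the coverage math concrete. The market-share numbers below are invented for illustration; they are not the figures behind the article's statistics.

# Hypothetical device market shares (fraction of a country's audience);
# the long tail of remaining devices is omitted.
us_shares = [0.12, 0.09, 0.07, 0.06, 0.06] + [0.02] * 16
india_shares = [0.04, 0.03, 0.03, 0.02, 0.02] + [0.01] * 25

def coverage(shares, top_n):
    """Audience share reached by supporting the top_n most popular devices."""
    return sum(sorted(shares, reverse=True)[:top_n])

# The same testing effort (20 devices) buys far less coverage in a
# fragmented market like India than in the US.
print(f"US, top 20 devices:    {coverage(us_shares, 20):.0%}")
print(f"India, top 20 devices: {coverage(india_shares, 20):.0%}")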


Industry 4.0 / Industrial IoT / Smart Factory
“This consolidation will strengthen the ability of the IIC to provide guidance and advance best practices on the uses of distributed-ledger technology across industries, and boost the commercialization of these products and services,” said 451 Research senior blockchain and DLT analyst Csilla Zsigri in a statement. Gartner vice president and analyst Al Velosa said that it’s possible the move to team up with TIoTA was driven in part by a new urgency to reach potential customers. Where other players in the IoT marketplace, like the major cloud vendors, have raked in billions of dollars in revenue, the IIoT vendors themselves haven’t been as quick to hit their sales targets. “This approach is them trying to explore new vectors for revenue that they haven’t before,” Velosa said in an interview. The IIC, whose founding members include Cisco, IBM, Intel, AT&T and GE, features 19 different working groups, covering everything from IIoT technology itself to security to marketing to strategy.



Up to half of developers work remotely; here's who's hiring them

It is estimated that there are between 18 and 21 million developers across the globe. Of these, only about one million -- or five percent -- are in the United States, so you can see how an employer in the US, or anywhere else for that matter, needs to spread its recruiting and staffing wings. It is in the best interest of tech-oriented employers, then, to be open to this global pool of talent. A number of companies are leading the way, actively hiring globally distributed tech workforces. Glassdoor recently published a list of leading companies that encourage remote work, which includes some prominent tech companies, and Remotive has been compiling a comprehensive list of more than 2,500 companies of all sizes that hire remote IT workers. Survey data from Stack Overflow, analyzed by Itoro Ikon, finds that of almost 89,000 developers participating in its most recent survey, 45% work remotely at least part of the time, and 10% indicated they are full-time remote workers.


The Fundamental Truth Behind Successful Development Practices: Software is Synthetic


Look across the open plan landscape of any modern software delivery organization and you will find signs of it, this way of thinking that contrasts sharply with the analytic roots of technology. Near the groves of standing desks, across from a pool of information radiators, you might see our treasured artifacts - a J-curve, a layered pyramid, a bisected board - set alongside inscriptions of productive principles. These are reminders of agile training past, testaments to the teams that still pay homage to the provided materials, having decided them worthy and made them their own. What makes these new ways of working so successful in software delivery? The answer lies in this fundamental yet uncelebrated truth - that software is synthetic. Software systems are creative compounds, emergent and generative; the product of elaborate interactions between people and technology.


5G is poised to transform manufacturing

Today, many manufacturers use fiber, Wi-Fi, and 4G LTE rather than 5G because 5G infrastructure, standards, and devices are not yet widely available and proven. “But many people are starting to look at 5G today, looking at it as a more future-proof strategy than adopting 4G,” said Dan Hays, principal and head of the US corporate strategy practice at PricewaterhouseCoopers LLP. “4G LTE has been around for a little over a decade.” The 5G devices available today are very early ones. “They are not yet at the mass-production level and have not come down the cost curve to drive large-scale adoption,” he said. According to Erik Josefsson, vice president and head of advanced industries at Ericsson, which makes underlying 5G technology, 5G is currently at Release 15, which offers high data rates, extended coverage, and lower latency than 4G, but does not yet reach the goal of 1-millisecond latency. "You can get 10 milliseconds," he said. "But you're not down to 1 millisecond yet. Release 16 is ultra-reliable low-latency, down below 10 milliseconds, for more complex use cases."


These five tech trends will dominate 2020


The constant drip-drip of data leaks and privacy catastrophes shows that security is still, at best, a work in progress for many organisations. And security is still a minor consideration for many business leaders, too. Perhaps that's because there have been so many leaks that they think the risk to their reputation is low. It's a dangerous assumption to make. More apps and more devices mean security teams are already spread too thinly. Add in new risks like Internet of Things projects, 5G devices and deepfakes, and the challenges mount unless companies take the broadest possible view of security. Organised crime and ransomware will still be the most consistent threats to most businesses; state-sponsored attacks and cyber-espionage will remain an exotic but potentially high-profile threat to a minority. For all this, the biggest risks will still be the basic ones: staff falling for phishing emails or using their pets' names as passwords, and poorly configured cloud apps. There will always be new threats, so prepare for the strangest while not forgetting the basics.


Three Surprising Ways Archiving Data Can Save Serious Money


Until recently, backup solutions for enterprises typically fell into two strategies: tape or disk-to-disk (D2D) replication. Both of these solutions come with significant price tags to back up a single terabyte of primary data. The common misconception is that tape backup is cheap. While an actual tape might be cheap, backing up primary data with tape also requires tape libraries, servers, software, data center space, power, cooling, and management overhead. These costs add up very quickly. Our research shows that backing up a single terabyte of primary data with tape could cost $138-$1,731 per year, depending on how frequently you complete a full backup. The other common backup solution, replication, requires backup workflows that replicate data from the primary NAS system to a secondary storage platform from the same vendor. In most cases, this means that the secondary storage system is architecturally similar to the primary NAS device, requiring its own hardware, software, data center space, power, cooling, and management.
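
A back-of-the-envelope model shows why full-backup frequency drives such a wide cost range. Every dollar figure below is an illustrative assumption, not a number from the research.

# Hypothetical annual cost of backing up 1 TB of primary data to tape.
MEDIA_COST_PER_TB = 10.0          # the tape itself is indeed cheap
OVERHEAD_PER_FULL_BACKUP = 25.0   # library, server, power, cooling, admin

def annual_tape_cost(fulls_per_year, retained_copies=4):
    media = MEDIA_COST_PER_TB * retained_copies
    overhead = OVERHEAD_PER_FULL_BACKUP * fulls_per_year
    return media + overhead

# Quarterly vs. monthly vs. weekly full backups: the operational overhead,
# not the media, dominates the bill.
for fulls in (4, 12, 52):
    print(f"{fulls:>2} full backups/year: ${annual_tape_cost(fulls):,.0f}")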



Quote for the day:


"There are many elements to a campaign. Leadership is number one. Everything else is number two.
 -- Bertolt Brecht

