Daily Tech Digest - February 17, 2018

The Three Do’s of DDoS protection

Attackers have been putting DDoS firmly in the IT and network consciousness – and they did it by substantially raising the bar for just how big and disruptive a DDoS attack can now be. ... DDoS attacks are not just growing in strength and frequency, but also diversifying in whom they target and in attack type, application-layer as well as volumetric. You no longer need to be a big organisation to be affected by DDoS – everyone is now a target. And as more of us conduct our business on internet-based systems, the risk of costly disruption grows. Attacks are backed by significant malicious resources, and are most effectively countered by the service provider that connects you to the Internet. DDoS attacks can strike at any time, potentially crippling network infrastructure and severely degrading network performance and reachability. Depending on the type and severity of an attack on a website or other IP-accessible system, the impact can run to thousands or even millions of dollars in lost revenue.



When Streams Fail: Implementing a Resilient Apache Kafka Cluster at Goldman Sachs


Gorshkov reminded the audience of the latency numbers that every programmer should know, and stated that the speed of light dictates that a best-case network round trip from New York City to San Francisco takes ~60ms, Virginia to Ohio ~12ms, and New York City to New Jersey ~4ms. With data centers in the same metro area or otherwise close together, multiple centers can effectively be treated as a single redundant data center for disaster recovery and business continuity. This is much the same approach as that taken by modern cloud vendors like AWS, with infrastructure divided into geographic regions and regions further divided into availability zones. Treating multiple data centers as one leads to an Apache Kafka cluster deployment strategy in which a single conceptual cluster spans multiple physical data centers.
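The stretch-cluster idea can be hinted at with a short sketch. The broker addresses, rack labels, and topic name below are hypothetical and not from the talk; the only Kafka-specific pieces are the per-broker broker.rack setting (used by Kafka's rack-aware replica placement) and the standard AdminClient API.

```java
// Sketch: creating a topic on one Kafka cluster whose brokers sit in nearby
// data centers. Assumes each broker is started with a broker.rack value
// identifying its data center (e.g. broker.rack=dc-nyc in server.properties),
// so rack-aware replica placement spreads replicas across sites.
// Broker hostnames and the topic name are hypothetical.

import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.admin.Admin;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

public class StretchClusterTopic {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // One bootstrap broker per data center (hypothetical hosts).
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG,
                  "kafka-nyc:9092,kafka-nj:9092,kafka-va:9092");

        try (Admin admin = Admin.create(props)) {
            // Replication factor 3: with rack-aware placement, each replica
            // lands in a different data center, so losing one site leaves
            // every partition fully available.
            NewTopic orders = new NewTopic("orders", 6, (short) 3);
            admin.createTopics(Collections.singletonList(orders)).all().get();
        }
    }
}
```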


Can Cybersecurity be Entrusted with AI?

Will AI be the bright future of security, given that the sheer volume of threats is becoming very difficult for humans alone to track? Or might AI usher in a much darker era? It all depends on natural intelligence, which is needed to develop AI and machine-learning tools. Despite popular belief, these technologies cannot replace humans (my own personal opinion); using them requires human training and oversight. As the results reveal, AI is here to stay and will have a large impact on security strategies moving forward, but side by side with natural intelligence. The current state of cybersecurity is highly vulnerable, but bringing AI systems into the mix can serve as a real turning point. These systems come with a number of substantial benefits that will help prepare cybersecurity professionals to take on cyber-attacks and safeguard the enterprise.


What’s Driving India’s Fintech Boom?

Mobile Payments
Industry analysts expect that payments will be a pathway to other areas such as lending, insurance, wealth management and banking. “Most people in India lack credit history. Digital payments give them a credit history which can be leveraged in other areas,” explains Prantik Ray, professor of finance at XLRI – Xavier School of Management. Ravi Bapna, professor of business analytics and information systems at the Carlson School of Management, University of Minnesota, adds: “Innovative data-driven and behavioral risk management models can overcome barriers that arise from lack of widespread and robust credit scoring of individuals.” Rajesh Kandaswamy, research director for banking and securities at Gartner, points out that in mature geographies, payment mechanisms are already evolved and basic banking services are a given. However, in countries like China and India, digital payments are evolving in tandem with the growth in e-commerce.


In a digital world, do you trust the data?

Trust is now a defining factor in an organization's success or failure. Indeed, trust underpins reputation, customer satisfaction, loyalty and other intangible assets. It inspires employees, enables global markets to function, reduces uncertainty and builds resilience. The problem is that - in today's environment - trust isn't just about the quality of an organization's brands, products, services and people. It's also about the trustworthiness of the data and analytics that are powering its technology. KPMG International's Guardians of trust report explores the evolving nature of trust in the digital world. Based on a survey of almost 2,200 global information technology (IT) and business decision-makers involved in strategy for data initiatives, the report identifies some of the key trends and emerging principles to support the development of trusted analytics in the digital age.


The Great Disruption of Your Career

Seriously; even coffee shops are now using affordable facial recognition technology with basic CRM to create an amazing experience for customers... "Hi Tony, your triple-shot decaf, skim, soy latte is on its way... did you manage to go water-skiing on the weekend?" Perfect... I'll be able to keep my head down deleting spammy emails while rocking away to Spotify... no need to place an order in advance, make eye contact or interact with anyone while securing my morning caffeine fix :-) White collar professions are not immune to the employment apocalypse. Combinations of technology with offshoring to lower cost markets are already biting like a savage dog at your crotch. Do you lie awake at night wondering how you can make yourself indispensable? What do you really do that cannot be automated?


Designing, Implementing, and Using Reactive APIs


Reactive programming is a vast subject and well beyond the scope of this article, but for our purposes, let’s describe it broadly as a way to build event-driven systems in a more fluent way than we would with a traditional imperative programming style. The goal is to move imperative logic to an asynchronous, non-blocking, functional style that is easier to understand and reason about. Many of the imperative APIs designed for these behaviors (threads, NIO callbacks, etc.) are not considered easy to use correctly and reliably, and in many cases using them still requires a fair amount of explicit management in application code. The promise of a reactive framework is that these concerns can be handled behind the scenes, allowing the developer to write code that focuses primarily on application functionality. The very first question to ask yourself when designing a reactive API is whether you even want a reactive API! Reactive APIs are not the correct choice for absolutely everything.
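As a minimal sketch of the imperative-to-reactive shift described above, the example below uses Project Reactor's Flux (one possible reactive library; the article does not prescribe a specific one, and the data and transformations here are hypothetical placeholders). It requires the reactor-core dependency.

```java
// A small declarative, non-blocking pipeline: events arrive asynchronously
// and flow through transformations instead of being managed with explicit
// threads or callbacks.

import java.time.Duration;
import reactor.core.publisher.Flux;

public class ReactiveSketch {
    public static void main(String[] args) throws InterruptedException {
        Flux.just("order-1", "order-2", "order-3")
            .delayElements(Duration.ofMillis(100))   // simulate async arrival of events
            .map(String::toUpperCase)                // declarative transformation
            .filter(id -> !id.endsWith("2"))         // declarative filtering
            .subscribe(
                id -> System.out.println("processed " + id),  // on each event
                err -> System.err.println("failed: " + err),  // error channel
                () -> System.out.println("done"));            // completion signal

        // The pipeline is non-blocking; the main thread only waits here so
        // the demo finishes printing before the JVM exits.
        Thread.sleep(500);
    }
}
```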


Wireless Reshaping IT/OT Network Best Practices

IoT, with its accompanying cloud services and Big Data analytics, routinely delivers immense, previously unheard-of amounts of data from devices and sensors. That means network architectures continue to adapt and will change dramatically to handle the data flow from these sensors. It also means networks will become outward focused, as the amount of data acquired from edge devices dwarfs the amount produced inside the network. Previously, wireless network architecture used a design in which a wireless access point connected directly to wired Ethernet, and network backhauls were always wired. More recently, however, companies with sprawling multi-building campuses, manufacturing sites, or process plants have been using wireless backhauls. Some of these use WiMAX (IEEE 802.16) broadband microwave links; others are optical. These wireless backhauls are significantly less expensive to install and provide secure data transmission.


GDPR: The Data Subject Perspective

The discussion that followed highlighted a key point: the value of the data means that the stakes are high. Organizations are coming to understand how much value can be driven by intelligent use of data. My opinion is that many individuals have sold themselves short in negotiations around the use of personal data. This is because individual data subjects have had limited knowledge, power or influence at a negotiating table that doesn’t really exist – unlike the agreement process for other contracts, in which both parties are normally well informed. GDPR implication: the key is intelligent use of data. Personal data that is not managed correctly will have less impact on an organization’s bottom line and will become a burden under GDPR. Organizations should review their data collection mechanisms and consider data minimisation and data masking technology to implement privacy by default and by design.
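As a toy illustration of the data minimisation and masking idea mentioned above (the field names, values and masking rule are hypothetical, not from the article), a record can be reduced to only the fields a downstream use case needs, with the direct identifier replaced by a one-way pseudonym:

```java
// Minimise and pseudonymise personal data before it is stored or shared for
// analytics. A real deployment would add salting/keyed hashing and proper
// key management; this only sketches the idea.

import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

public class PseudonymiseCustomer {

    // Replace a direct identifier with a SHA-256 pseudonym so records can
    // still be linked without storing the raw value.
    static String pseudonymise(String value) throws Exception {
        MessageDigest digest = MessageDigest.getInstance("SHA-256");
        byte[] hash = digest.digest(value.getBytes(StandardCharsets.UTF_8));
        StringBuilder hex = new StringBuilder();
        for (byte b : hash) {
            hex.append(String.format("%02x", b));
        }
        return hex.toString();
    }

    public static void main(String[] args) throws Exception {
        String email = "jane.doe@example.com";                // hypothetical personal data
        String storedRecord = pseudonymise(email) + ",EU,2018-02"; // raw email never stored
        System.out.println(storedRecord);
    }
}
```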


A business guide to raising artificial intelligence in a digital economy

The report highlights the fundamental shift in leadership required to cultivate partnerships with customers and business partners, and to further accelerate the adoption of artificial intelligence as the fuel for enterprises to grow and deliver social impact. Accenture's 2018 report ...  highlights how rapid advancements in technologies -- including artificial intelligence (AI), advanced analytics and the cloud -- are enabling companies not just to create innovative products and services, but to change the way people work and live. This, in turn, is changing companies' relationships with their customers and business partners. "Technology," said Paul Daugherty, Accenture's chief technology and innovation officer, "is now firmly embedded throughout our everyday lives and is reshaping large parts of society. This requires a new type of relationship, built on trust and the sharing of large amounts of personal information."



Quote for the day:


"A wise man gets more use from his enemies than a fool from his friends." -- Baltasar Gracian

