
Daily Tech Digest - February 05, 2024

8 things that should be in a company BEC policy document

Smart boards and CEOs should demand that CISOs include BEC-specific procedures in their incident response (IR) plans, and companies should create policies that require security teams to update these IR plans regularly and test their efficacy. As a part of that, security and legal experts recommend that organizations plan for legal involvement across all stages of incident response. Legal especially should be involved in how incidents are communicated with internal and external stakeholders to ensure the organization doesn’t increase its legal liability if a BEC attack hits. “Any breach may carry legal liability, so it’s best to have the discussion before the breach and plan as much as possible to address issues in advance rather than to inadvertently take actions that either causes liability that might not otherwise have existed, or increases liability beyond what would have existed,” Reiko Feaver, a privacy and data security attorney and partner at Culhane Meadows, tells CSO. Feaver, who advises clients on BEC best practices, training and compliance, says BEC policy documents should stipulate that legal be part of the threat modeling team, analyzing potential impacts from different types of BEC attacks so the legal liability viewpoint can be folded into the response plan.


Many Employees Fear Being Replaced by AI — Here's How to Integrate It Into Your Business Without Scaring Them.

The first goal of integrating AI should be understanding the quickest way for it to start having a positive monetary benefit. While our AI project is still a work in progress, we are expecting to increase revenue anywhere from $2 million to $20 million as a result of a first round of investment of under $100,000. But to achieve that type of result, leaders need to get comfortable with AI and figure out the challenges and complexities they might encounter. ... If you are a glass-half-full kind of person, listening to the glass-half-empty kind of person offers a complementary point of view. Whenever I have ideas to really move the numbers, I tend to act fast. It is crucial that people understand that I am not fast-tracking AI integration because I am unhappy with our current process or people. It is because I am happy that I will not risk what we already have unless I am fully sold on the range of the upside — and I want to expedite the learning process to get to those benefits faster. I still want to talk to as many people as I can — employees, developers, marketing folks, product managers, external investors — both to gauge the tone of responses and to surface any major issues. Those red flags may be great things to consider, or a sign that I need to give people more information. Either way, my response can alleviate their concerns.


The role of AI in modernising accounting practices

Accountants, like any other professionals, have varied views on AI—some see it as a friend, appreciating its ability to automate tasks, enhance efficiency, and reduce errors. They view AI as a valuable ally, freeing up time for strategic and analytical work. On the flip side, others perceive AI as a threat, fearing job displacement and the loss of the human touch in financial decision-making. Striking a balance between leveraging AI’s benefits for efficiency while preserving the importance of human skills is crucial for successful integration into accounting practices. ... Notably, machine learning algorithms and natural language processing are gaining prominence, enabling accountants to delve into more sophisticated tasks such as intricate data analysis, anomaly detection, and the generation of actionable insights from complex datasets. As technology continues to evolve, the trajectory of AI in accounting is expected to expand further. Future developments might include more sophisticated predictive analytics, enhanced natural language understanding for improved communication, and increased automation of compliance-related tasks. 


10 ways to improve IT performance (without killing morale)

When working to improve IT performance, leaders frequently focus on the technology instead of zeroing in on the business process. “We are usually motivated to change what’s within the scope of our control because we can move more quickly and see results sooner,” says Matthew Peters, CTO at technology services firm CAI. Yet a technology-concentrated approach can create significant risk, such as breaking processes that lie outside of IT or overspending on solutions that may only perpetuate the issue that still must be resolved. ... A great way to improve IT performance while maintaining team morale is by developing a culture of collaboration, says Simon Ryan, CTO at network management and audit software firm FirstWave. “Encourage team members to communicate openly — listen to their concerns and provide opportunities for skill development,” he explains. “This strategy is advantageous because it links individual development to overall team performance, thereby fostering a sense of purpose.” Ignoring the human factor is the most common team-related blunder, Ryan says. “An overemphasis on tasks and deadlines without regard for the team’s well-being can lead to burnout and unhappiness,” he warns. 


How Digital Natives are Reshaping Data Compliance

With their forward-thinking mindsets, today's chief compliance officers are changing the perception of emerging technologies from threats to opportunities. Rather than reacting with outright bans, they thoughtfully integrate new tools into the compliance framework. This balances innovation with appropriate risk management. It also positions compliance as an enabler of progress rather than a roadblock. The benefits of this mindset are many: A forward-thinking culture that thoughtfully integrates innovations into business processes and compliance frameworks. This allows organizations to harness the benefits of technology ethically. With an opportunistic mindset, compliance teams can explore how new tools like AI, blockchain, and automation can be used to make compliance activities more effective, efficient and data driven. When seen as working alongside business leaders to evaluate risks and implement appropriate guardrails for new tech, compliance teams’ collaborative approaches enable progress and innovation. These new technologies open up possibilities to continuously improve and modernize compliance programs. An opportunity-driven perspective seizes on tech's potential.


How to choose the right NoSQL database

Before choosing a NoSQL database, it's important to be certain that NoSQL is the best choice for your needs. Carl Olofson, research vice president at International Data Corp. (IDC), says "back office transaction processing, high-touch interactive application data management, and streaming data capture" are all good reasons for choosing NoSQL. ... NoSQL databases can break down data into segments—or shards—which can be useful for large deployments running hundreds of terabytes, Yuhanna says. “Sharding is an essential capability for NoSQL to scale databases,” Yuhanna says. “Customers often look for NoSQL solutions that can automatically expand and shrink nodes in horizontally scaled clusters, allowing applications to scale dynamically.” ... Some NoSQL databases can run on-premises, some only in the cloud, and others in a hybrid cloud environment, Yuhanna says. “Also, some NoSQL has native integration with cloud architectures, such as running on serverless and Kubernetes environments,” Yuhanna says. “We have seen serverless as an essential factor for customers, especially those who want to deliver good performance and scale for their applications, but also want to simplify infrastructure management through automation.”
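To make sharding concrete, here is a toy sketch in Python (the hash-modulo scheme, class, and key names are illustrative only, not any particular NoSQL product's implementation) of how a key's hash decides which shard holds a record:

```python
import hashlib

class ShardedStore:
    """Toy hash-sharded key-value store: each key maps to exactly one
    shard, mimicking how NoSQL databases distribute data across nodes."""

    def __init__(self, num_shards=4):
        self.shards = [dict() for _ in range(num_shards)]

    def _shard_for(self, key):
        # Stable hash so the same key always lands on the same shard.
        digest = hashlib.md5(key.encode()).hexdigest()
        return int(digest, 16) % len(self.shards)

    def put(self, key, value):
        self.shards[self._shard_for(key)][key] = value

    def get(self, key):
        return self.shards[self._shard_for(key)].get(key)

store = ShardedStore(num_shards=4)
for i in range(1000):
    store.put(f"user:{i}", {"id": i})

print(store.get("user:42"))            # the record round-trips
print([len(s) for s in store.shards])  # keys spread across all shards
```

Note that plain modulo hashing forces most keys to move when the shard count changes; real systems typically use consistent hashing or virtual nodes so clusters can expand and shrink with minimal data movement, which is the dynamic scaling Yuhanna describes.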


What’s Coming in Analytics (And How We’ll Get There)

The notion of composability is not just a buzzword; it's the cornerstone of modern application development. The industry is gradually moving towards a more composable enterprise, where modular, agile products integrate insights, data, and operations at their core. This transition facilitates the creation of innovative experiences tailored to user needs, significantly lowering development costs, accelerating time to market and fostering a thriving generative AI ecosystem. This more agile application development environment will also lead to a convergence of AI and BI, such that AI-powered embedded analytics may even supplant current BI tools. This will lead to a more data-driven culture where the business uses real-time analytics as an integral part of its daily work, enabling more proactive and predictive decision-making. ... As we advance into the future, the analytics industry is poised on the edge of a monumental shift. This evolution is akin to discovering a new, uncharted continent in the realm of data processing and complex analysis. This exploration into unknown territories will reveal analytics capabilities far beyond our current understanding.


Businesses banning or limiting use of GenAI over privacy risks

Organizations recognize the need to reassure their customers about how their data is being used. “94% of respondents said their customers would not buy from them if they did not adequately protect data,” explains Harvey Jang, Cisco VP and Chief Privacy Officer. “They are looking for hard evidence the organization can be trusted as 98% said that external privacy certifications are an important factor in their buying decisions. These stats are the highest we’ve seen in Cisco’s privacy research over the years, proving once more that privacy has become inextricably tied to customer trust and loyalty. This is even more true in the era of AI, where investing in privacy better positions organizations to leverage AI ethically and responsibly.” Despite the costs and requirements privacy laws may impose on organizations, 80% of respondents said privacy laws have positively impacted them, and only 6% said the impact has been negative. Strong privacy regulation boosts consumer confidence and trust in the organizations where they share their data. Further, many governments and organizations implement data localization requirements to keep specific data within a country or region.


4 ways to help your organization overcome AI inertia

The research suggests the tricky combination of a fearful workforce and the unpredictability of the current regulatory environment means many organizations are still stuck at the AI starting gate. As a result, not only are pilot projects thin on the ground, but so are the basic foundations -- in terms of both data frameworks and strategies -- upon which these initiatives are created. About two-fifths (41%) of data leaders said they have little or no data governance framework (a set of standards and guidelines that enables organizations to manage their data effectively), just one percentage point higher than the 40% who said the same in the previous year's Maturity Index. Just over a quarter of data leaders (27%) said their organization has no data strategy at all, which is only a slight improvement on the previous year's figure (29%). "I get why not everybody's quite there yet," says Carruthers, who, as a former CDO, understands the complexities involved in strategy and governance. ... The good news is some digital leaders are making headway. Andy Moore, CDO at Bentley Motors, is focused on building the foundations for the exploitation of emerging technologies, such as AI.


Data Lineage in Modern Data Engineering

There are generally two types of data lineage: forward lineage and backward lineage. Forward lineage, also known as downstream lineage, tracks the flow of data from its source to its destination. It outlines the path that data takes through various stages of processing, transformations, and storage until it reaches its destination. It helps developers understand how data is manipulated and transformed, aiding in the design and improvement of the overall data processing workflow and quickly identifying the point of failure. By tracing the data flow forward, developers can pinpoint where transformations or errors occurred and address them efficiently. It is essential for predicting the impact of changes on downstream processes. ... Backward lineage, also known as upstream lineage, traces the path of data from its destination back to its source. It provides insights into the origins of the data and the various transformations it undergoes before reaching its current state. It is crucial for ensuring data quality by allowing developers to trace any issues or discrepancies back to their source. By understanding the data's journey backward, developers can identify and rectify anomalies at their origin.
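Both directions can be sketched as simple graph traversals. In this minimal Python sketch (the dataset names and graph shape are hypothetical), forward lineage walks edges downstream while backward lineage walks the reversed graph:

```python
# Edges point from a source dataset to the datasets derived from it.
lineage = {
    "raw_orders":   ["clean_orders"],
    "raw_users":    ["clean_users"],
    "clean_orders": ["orders_by_region"],
    "clean_users":  ["orders_by_region"],
    "orders_by_region": ["revenue_dashboard"],
}

def forward(node, graph):
    """Downstream lineage: everything derived (directly or not) from node."""
    seen, stack = set(), [node]
    while stack:
        for child in graph.get(stack.pop(), []):
            if child not in seen:
                seen.add(child)
                stack.append(child)
    return seen

def backward(node, graph):
    """Upstream lineage: every source that feeds node."""
    reversed_graph = {}
    for src, dsts in graph.items():
        for dst in dsts:
            reversed_graph.setdefault(dst, []).append(src)
    return forward(node, reversed_graph)

print(forward("raw_orders", lineage))
print(backward("revenue_dashboard", lineage))
```

With this structure, `forward("raw_orders", lineage)` returns every artifact a change to the raw orders could break, while `backward("revenue_dashboard", lineage)` returns every source worth inspecting when the dashboard looks wrong.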



Quote for the day:

“Nobody talks of entrepreneurship as survival, but that’s exactly what it is.” -- Anita Roddick

Daily Tech Digest - September 28, 2023

What is artificial general intelligence really about?

AGI is a hypothetical intelligent agent that can accomplish the same intellectual achievements humans can. It could reason, strategize, plan, use judgment and common sense, and respond to and detect hazards or dangers. This type of artificial intelligence is much more capable than the AI that powers the cameras in our smartphones, drives autonomous vehicles, or completes the complex tasks we see performed by ChatGPT. ... AGI could change our world, advance our society, and solve many of the complex problems humanity faces whose solutions lie far beyond humans' reach. It could even identify problems humans don't yet know exist. "If implemented with a view to our greatest challenges, [AGI] can bring pivotal advances in healthcare, improvements to how we address climate change, and developments in education," says Chris Lloyd-Jones, head of open innovation at Avanade. ... AGI carries considerable risks, and experts have warned that advancements in AI could cause significant disruptions to humankind. But expert opinions vary on quantifying the risks AGI could pose to society.


How to avoid the 4 main pitfalls of cloud identity management

DevOps and Security teams are often at odds with each other. DevOps wants to ship applications and software as fast and efficiently as possible, while Security’s goal is to slow the process down and make sure bad actors don’t get in. At the end of the day, both sides are right – fast development is useless if it creates misconfigurations or vulnerabilities and security is ineffective if it’s shoved toward the end of the process. Historically, deploying and managing IT infrastructure was a manual process. This setup could take hours or days to configure, and required coordination across multiple teams. (And time is money!) Infrastructure as code (IaC) changes all of that and enables developers to simply write code to deploy the necessary infrastructure. This is music to DevOps ears, but creates additional challenges for security teams. IaC puts infrastructure in the hands of developers, which is great for speed but introduces some potential risks. To remedy this, organizations need to be able to find and fix misconfigurations in IaC to automate testing and policy management.
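The kind of automated check involved can be sketched in a few lines. This toy Python scanner (the resource schema and rule names are invented for illustration; real tools parse actual Terraform or CloudFormation) flags misconfigurations before anything is deployed:

```python
# Hypothetical parsed IaC resources (not tied to any real tool's schema).
resources = [
    {"type": "storage_bucket", "name": "logs", "public": True},
    {"type": "vm", "name": "web", "open_ports": [22, 80, 443]},
    {"type": "storage_bucket", "name": "backups", "public": False},
]

# Each rule is an id plus a predicate that returns True on a violation.
RULES = [
    ("PUBLIC_BUCKET",
     lambda r: r["type"] == "storage_bucket" and r.get("public")),
    ("SSH_OPEN",
     lambda r: r["type"] == "vm" and 22 in r.get("open_ports", [])),
]

def scan(resources):
    """Return (rule_id, resource_name) for every violated rule --
    the kind of check a CI pipeline can run before any deploy."""
    return [(rule_id, r["name"])
            for r in resources
            for rule_id, check in RULES
            if check(r)]

findings = scan(resources)
print(findings)
exit_code = 1 if findings else 0  # fail the pipeline on any finding
```

Running a check like this in CI and failing the build on any finding is what lets security keep pace with IaC-speed deployments instead of reviewing infrastructure after the fact.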


Why a DevOps approach is crucial to securing containers and Kubernetes

DevOps, which is heavily focused on automation, has significantly accelerated development and delivery processes, making the production cycle lightning fast, leaving traditional security methods lagging behind, Carpenter says. “From a security perspective, the only way we get ahead of that is if we become part of that process,” he says. “Instead of checking everything at the point it’s deployed or after deployment, applying our policies, looking for problems, we embed that into the delivery pipeline and start checking security policy in an automated fashion at the time somebody writes source code, or the time they build a container image or ship that container image, in the same way developers today are very used to, in their pipelines.” It’s “shift left security,” or taking security policies and automating them in the pipeline to unearth problems before they get to production. It has the advantage of speeding up security testing and enables security teams to keep up with the efficient DevOps teams. “The more things we can fix early, the less we have to worry about in production and the more we can find new, emerging issues, more important issues, and we can deal with higher order problems inside the security team,” he says.


Understanding Europe's Cyber Resilience Act and What It Means for You

The act is broader than a typical IoT security standard because it also applies to software that is not embedded. That is to say, it applies to the software you might use on your desktop to interact with your IoT device, rather than just applying to the software on the device itself. Since non-embedded software is where many vulnerabilities take place, this is an important change. A second important change is the requirement for five years of security updates and vulnerability reporting. Few consumers who buy an IoT device expect regular software updates and security patches for that type of time range, but both will be a requirement under the CRA. The third important point of the standard is the requirement for some sort of reporting and alerting system for vulnerabilities so that consumers can report vulnerabilities, see the status of security and software updates for devices, and be warned of any risks. The CRA also requires that manufacturers notify the European Union Agency for Cybersecurity (ENISA) of a vulnerability within 24 hours of discovery. 


Conveying The AI Revolution To The Board: The Role Of The CIO In The Era Of Generative AI

Narratives can be powerful, especially when they’re rooted in reality. By curating a list of businesses that have thrived with or invested in AI—especially those within your sector—and bringing forth their successful integration case studies, you can demonstrate not just possibilities but proven success. It conveys a simple message: If they can, so can we. ... Change, especially one as foundational as AI, can be daunting. Set up a task force to outline the stages of AI implementation, starting with pilot projects. A clear, step-by-step road map demystifies the journey from our current state to an AI-integrated future. It offers a sense of direction by detailing resource allocations, potential milestones and timelines—transforming the AI proposition from a vague idea into a concrete plan. ... In our zeal to champion AI, we mustn’t overlook the ethical considerations it brings. Draft an AI ethics charter, highlighting principles and practices to ensure responsible AI adoption. Addressing issues like data privacy, bias mitigation and the need for transparent algorithms proactively showcases a balanced, responsible approach.


Chip industry strains to meet AI-fueled demands — will smaller LLMs help?

Avivah Litan, a distinguished vice president analyst at research firm Gartner, said sooner or later the scaling of GPU chips will fail to keep up with growth in AI model sizes. “So, continuing to make models bigger and bigger is not a viable option,” she said. iDEAL Semiconductor's Burns agreed, saying, "There will be a need to develop more efficient LLMs and AI solutions, but additional GPU production is an unavoidable part of this equation." "We must also focus on energy needs," he said. "There is a need to keep up in terms of both hardware and data center energy demand. Training an LLM can represent a significant carbon footprint. So we need to see improvements in GPU production, but also in the memory and power semiconductors that must be used to design the AI server that utilizes the GPU." Earlier this month, the world’s largest chipmaker, TSMC, admitted it's facing manufacturing constraints and limited availability of GPUs for AI and HPC applications. 


NoSQL Data Modeling Mistakes that Ruin Performance

Getting your data modeling wrong is one of the easiest ways to ruin your performance. And it’s especially easy to screw this up when you’re working with NoSQL, which (ironically) tends to be used for the most performance-sensitive workloads. NoSQL data modeling might initially appear quite simple: just model your data to suit your application’s access patterns. But in practice, that’s much easier said than done. Fixing data modeling is no fun, but it’s often a necessary evil. If your data modeling is fundamentally inefficient, your performance will suffer once you scale to some tipping point that varies based on your specific workload and deployment. Even if you adopt the fastest database on the most powerful infrastructure, you won’t be able to tap its full potential unless you get your data modeling right. ... How do you address large partitions via data modeling? Basically, it’s time to rethink your primary key. The primary key determines how your data will be distributed across the cluster, which improves performance as well as resource utilization.
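The effect of primary key choice on partition size is easy to simulate. This Python sketch uses a synthetic workload (the sensor-event shape and skew are invented for illustration) to compare a sensor-only partition key against a composite (sensor, day) key:

```python
from collections import Counter
import random

random.seed(0)

# Simulated event stream in which two noisy sensors dominate.
events = [
    {"sensor": f"s{random.choices(range(20), weights=[50] * 2 + [1] * 18)[0]}",
     "day": f"2024-02-{random.randint(1, 28):02d}"}
    for _ in range(10_000)
]

def partition_sizes(events, key_fn):
    """Count how many rows land in each partition for a candidate key."""
    return Counter(key_fn(e) for e in events)

# Candidate 1: partition by sensor only -- hot sensors become huge partitions.
by_sensor = partition_sizes(events, lambda e: e["sensor"])
# Candidate 2: composite (sensor, day) key -- spreads hot sensors over time.
by_sensor_day = partition_sizes(events, lambda e: (e["sensor"], e["day"]))

print("largest partition, sensor-only key:", max(by_sensor.values()))
print("largest partition, (sensor, day) key:", max(by_sensor_day.values()))
```

The composite key bounds how large any single partition can grow, which is the usual remedy for the large-partition problem: pick key components that spread the hottest entities across many partitions while still matching your access patterns.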


AI and customer care: balancing automation and agent performance

AI alone brings real challenges to delivering outstanding customer service and satisfaction. For starters, unless this technology is near perfect, it can lead to misunderstandings and errors that frustrate customers. It also lacks the humanised context of empathy and understanding of every customer’s individual and unique needs. A concern we see repeatedly is whether AI will eventually replace human engagement in customer service. Despite the recent advancements in AI technology, I think we can agree this remains unlikely. Complex issues that arise daily with customers still require human assistance. While AI’s strength lies in dealing with low-touch tasks and making agents more effective and productive, at this point, more nuanced issues still demand the human touch. However, the expectation from AI shouldn’t be to replace humans. Instead, the focus should be on how AI can streamline access to live-agent support and enhance the end-to-end customer care process.


How to Handle the 3 Most Time-Consuming Data Management Activities

In the context of data replication or migration, data integrity can be compromised, resulting in inconsistencies or discrepancies between the source and target systems. This issue is the second most common challenge faced by data producers, cited by 40% of organizations, according to The State of DataOps report. Replication processes generate redundant copies of data, while migration efforts may inadvertently leave extraneous data in the source system. Consequently, this situation can lead to uncertainty regarding which data version to rely upon and can result in wasteful consumption of storage resources. ... Another factor affecting data availability is the use of multiple cloud service providers and software vendors. Each offers proprietary tools and services for data storage and processing. Organizations that heavily invest in one platform may find it challenging to switch to an alternative due to compatibility issues. Transitioning away from an ecosystem can incur substantial costs and effort for data migration, application reconfiguration, and staff retraining.
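One common way to detect source/target drift after replication or migration is to compare per-row fingerprints. A minimal Python sketch (the row format and `id` key field are assumptions for illustration):

```python
import hashlib
import json

def row_fingerprints(rows, key="id"):
    """Map each row's key to a hash of its full, canonicalized contents."""
    return {r[key]: hashlib.sha256(
                json.dumps(r, sort_keys=True).encode()).hexdigest()
            for r in rows}

def diff_replicas(source, target):
    """Classify every discrepancy between a source table and its replica."""
    src, dst = row_fingerprints(source), row_fingerprints(target)
    return {
        "missing_in_target": sorted(src.keys() - dst.keys()),
        "extra_in_target":   sorted(dst.keys() - src.keys()),
        "mismatched":        sorted(k for k in src.keys() & dst.keys()
                                    if src[k] != dst[k]),
    }

source = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}, {"id": 3, "amt": 30}]
target = [{"id": 1, "amt": 10}, {"id": 2, "amt": 99}, {"id": 4, "amt": 40}]
print(diff_replicas(source, target))
```

Running a reconciliation like this after each replication or migration run turns "which copy do we trust?" from a guess into a report, at the cost of one scan over each side.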


The Secret of Protecting Society Against AI: More AI?

One of the areas of greatest concern with generative AI tools is the ease with which deepfakes -- images or recordings that have been convincingly altered and manipulated to misrepresent someone -- can be generated. Whether it is highly personalized emails or texts, audio generated to match the style, pitch, cadence, and appearance of actual employees, or even video crafted to appear indistinguishable from the real thing, phishing is taking on a new face. To combat this, tools, technologies, and processes must evolve to create verifications and validations to ensure that the parties on both ends of a conversation are trusted and validated. One of the methods of creating content with AI is using generative adversarial networks (GAN). With this methodology, two processes -- one called the generator and the other called the discriminator -- work together to generate output that is almost indistinguishable from the real thing. During training and generation, the tools go back and forth between the generator creating output and the discriminator trying to guess whether it is real or synthetic. 
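The adversarial loop can be illustrated with a deliberately tiny numeric example (1-D data, hand-derived gradients, no ML framework); it is a sketch of the generator/discriminator dynamic, not a production GAN:

```python
import math
import random

random.seed(0)

def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

REAL_MEAN = 4.0        # "real" data is drawn from N(4, 1)
mu = 0.0               # generator parameter: g(z) = mu + z
w, b = 0.0, 0.0        # discriminator: D(x) = sigmoid(w*x + b)
lr_d, lr_g, batch = 0.05, 0.05, 16

for _ in range(2000):
    real = [random.gauss(REAL_MEAN, 1) for _ in range(batch)]
    fake = [mu + random.gauss(0, 1) for _ in range(batch)]

    # Discriminator step: push D(real) toward 1 and D(fake) toward 0
    # (hand-derived gradients of log D(real) + log(1 - D(fake))).
    gw = (sum((1 - sigmoid(w * x + b)) * x for x in real)
          - sum(sigmoid(w * x + b) * x for x in fake)) / batch
    gb = (sum(1 - sigmoid(w * x + b) for x in real)
          - sum(sigmoid(w * x + b) for x in fake)) / batch
    w += lr_d * gw
    b += lr_d * gb

    # Generator step: climb log D(fake), i.e. move mu toward the
    # region the discriminator currently labels "real".
    gmu = sum((1 - sigmoid(w * x + b)) * w for x in fake) / batch
    mu += lr_g * gmu

print(f"generator mean after training: {mu:.2f} (real mean is {REAL_MEAN})")
```

Each step the discriminator learns to score real samples above fakes, and the generator follows the discriminator's gradient until its output distribution overlaps the real one. The same push-and-pull, at vastly larger scale and over images or audio instead of scalars, is what makes deepfake output so hard to distinguish from the real thing.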



Quote for the day:

“You are the only one who can use your ability. It is an awesome responsibility.” -- Zig Ziglar

Daily Tech Digest - July 19, 2019

How edge computing is driving a new era of CDN

It’s not that long ago that there was a transition from the heavy physical monolithic architecture to the agile cloud. But all that really happened was the transition from the physical appliance to a virtual cloud-based appliance. Maybe now is the time that we should ask, is this the future that we really want? One of the main issues in introducing edge applications is the mindset. It is challenging to convince yourself or your peers that the infrastructure you have spent all your time working on and investing in is not the best way forward for your business. Although the cloud has created a big buzz, just because you migrate to the cloud does not mean that your applications will run faster. In fact, all you are really doing is abstracting the physical pieces of the architecture and paying someone else to manage it. The cloud has, however, opened the door for the edge application conversation. We have already taken the first step to the cloud and now it's time to make the second move. Basically, an edge application, at its simplest, is a programmable CDN. A CDN is an edge application, and an edge application is a superset of what your CDN is doing.



Despite BlueKeep Warnings, Many Organizations Fail to Patch

BlueKeep is a serious vulnerability that could enable attackers to compromise Remote Desktop Services in Windows, which enables access to networked computers via remote desktop protocol. Attackers who successfully exploit the flaw could gain full, remote access to a system, including the ability to create user accounts and give them full administrator privileges, as well as to execute any code. "The vulnerability requires no authentication and is regarded as 'wormable,' meaning that if it were successfully exploited it could be used by self-replicating malware to spread across the internet rapidly," security firm Sophos warns in a new report. "WannaCry and NotPetya used a similarly wormable flaw in Microsoft's SMB v1 to spread around the globe in a matter of hours." One saving grace - so far at least - is that security experts have yet to see any in-the-wild attacks that use BlueKeep. But until companies patch, they remain at risk. "Patching, or rather good cyber hygiene, is an integral component of every company's defense against cyberattacks," Raj Samani, chief scientist at McAfee, tells Information Security Media Group.


Microsoft gets boost in SaaS revenue and pushes Teams platform


The company said its Commercial Cloud business achieved annual revenue of $38bn, and grew by 39% in the quarter with revenue of $11bn, while its Intelligent Cloud business grew by 19% with revenue of $11.4bn. The company also reported growth of 23% in the number of commercial Office 365 seats and strong demand for Windows 10 among commercial PC manufacturers driven by end of support for Windows 7 in January 2020. Satya Nadella, chief executive officer of Microsoft, said: “Every day we work alongside our customers to help them build their own digital capability – innovating with them, creating new businesses with them, and earning their trust. This commitment to our customers’ success is resulting in larger, multi-year commercial cloud agreements and growing momentum across every layer of our technology stack.” During the earnings call, Nadella described Teams as Microsoft’s fastest-growing platform. “There is no question this last fiscal year has been an absolute breakout year for Teams in terms of both product innovation and, most importantly, at-scale deployment and usage,” he said.


Digital technologies and the future of geospatial data


Mapping an area correctly can be a painstaking responsibility, but it's easier with help from drones. They work especially well for geospatial analysis needs due to their maximum altitude capabilities of 400 feet and imaging technology that enables capturing ground image data in higher resolutions than satellites or planes. The versatility of drones makes them fantastic for a wide range of mapping projects. For example, a retail brand might use a drone to get details about terrain in the potential location of a new retail store. Then, construction companies can do something similar by factoring drone mapping data into their plans as new buildings or renovations get underway. One of the main reasons why drones are such a hot topic now is because people associate them with the rapid delivery of things they order from e-commerce stores. Although drones do make things more convenient that way, they are also used when companies plan the most efficient distribution routes. Geospatial mapping data offers information to e-commerce enterprises, whether people receive their shipments with drones or through other means.


Does net neutrality still matter in our post-web world?

When the phrase was coined, it was in the context of a debate in the US Congress over the idea of a possible nationwide license for broadband service providers. States and municipalities were responsible for granting such licenses to limited geographies, and Republicans in the House were looking for new sources of revenue. Under the provisions of a never-passed law called the COPE Act, ISPs would be given incentives to purchase nationwide licenses instead of more localized ones. One such incentive was a waiver of enforcement of any laws or regulations restricting ISPs' right to divide their pipelines into "good/better/best" service tiers. There was substantive opposition, but Sen. Ron Wyden (D – Oregon) raised the stakes to a moral issue. At issue, he argued, was the small publisher's and garage-based enterprise's right to conduct their business on the same Internet as Google and eBay, as equal players in a digital market. Politically speaking, the concept of net neutrality has been as malleable as sediment from an Oregon mudslide.


Is SQL Beating NoSQL?

What we need is an interface that allows pieces of this stack to communicate with one another. Ideally, something already standardized in the industry. Something that would allow us to swap in/out various layers with minimal friction. That is the power of SQL. Like IP, SQL is a universal interface. But SQL is in fact much more than IP. Because data also gets analyzed by humans. And true to the purpose that SQL’s creators initially assigned to it, SQL is readable. Is SQL perfect? No, but it is the language that most of us in the community know. And while there are already engineers out there working on a more natural language-oriented interface, what will those systems then connect to? SQL. ... SQL is back. Not just because writing glue code to kludge together NoSQL tools is annoying. Not just because retraining workforces to learn a myriad of new languages is hard. Not just because standards can be a good thing. But also because the world is filled with data. It surrounds us, binds us. At first, we relied on our human senses and sensory nervous systems to process it.
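That universality is easy to demonstrate: the same declarative query text runs unchanged against any engine that speaks SQL. A small Python example using the standard library's built-in SQLite driver (the table and data are illustrative):

```python
import sqlite3

# SQL as the common interface: this query text is plain standard SQL,
# executed here against an in-memory SQLite database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user TEXT, amount REAL)")
conn.executemany("INSERT INTO events VALUES (?, ?)",
                 [("ana", 10.0), ("ana", 5.0), ("bo", 7.5)])

query = """
    SELECT user, SUM(amount) AS total
    FROM events
    GROUP BY user
    ORDER BY total DESC
"""
for user, total in conn.execute(query):
    print(user, total)
```

Swap the connection object for a different SQL-speaking backend and the query itself needs no changes; that interchangeability between layers is exactly the "universal interface" argument.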


Container security improves overall enterprise IT posture  


Once apps reach the production Kubernetes environment, security policies enforced through Aqua allow all developers and IT ops pros read-only access to their activities. This improves and speeds up application development, and lets IT pros troubleshoot faster than they could with VMs; in the past, before automated whitelisting tools were available for containers, Recurly's security staff restricted such access more tightly. Also, since containers separate application processes from the underlying host, admins can more strictly lock down the host itself with tools such as Google's Container-Optimized OS. "We are heavily running immutable hosts today, so even if you break out of a container and get on a host, good luck," Hosman said. "You can't run anything, install anything, or pivot to anything, and if we restart the host, everything just resets." Recurly's goal is to move away from human responses to alerts, whether they refer to IT monitoring or container security issues, and toward a remediation response to issues through code.


Microservices: Myth, Madness, or Magic?

The reality is, you almost never need microservices to achieve the above "holy grail"; you just need a decent architecture. So let's redefine microservices: Microservices: Yet another concept to fix the bad architecture created by bad software developers and to make money for big businesses that feed on the bad software practices of others. An article on Medium writes, "Conceptually, Microservices extend the same principles that engineers have employed for decades."[2] Wrong. These principles have existed for decades, but "employed?" Hardly ever. Similarly, a post on New Relic states: "When using microservices, you isolate software functionality into multiple independent modules that are individually responsible for performing precisely defined, standalone tasks. These modules communicate with each other through simple, universally accessible application programming interfaces (APIs)."[3] Wait, we need microservices to achieve this? Wasn't this the promise of OOP? Isn't this the promise of every newfangled framework like MVVM, Angular, and so forth?


Microsoft to explore using Rust

"A developer's core job is not to worry about security but to do feature work," Thomas said. "Rather than investing in more and more tools and training and vulnerability fixes, what about a development language where they can't introduce memory safety issues into their feature work in the first place? That would help the feature developers, the security engineers, and the customers." Microsoft looking into Rust as a safer alternative to C++ isn't actually such a big deal. The OS maker has been looking for safer C and C++ alternatives for years. In June 2016, Microsoft open-sourced "Checked C," an extension to the C programming language that brought new features to address a series of security-related issues. Microsoft looking into Rust before any other memory-safe language is also not a bad decision. Besides offering stronger memory protections than C#, Rust is also more popular with developers these days and might be easier to recruit for. ... Developers love it because of its simpler syntax and the fact that apps coded in Rust don't yield the same amount of bugs, allowing developers to focus on expanding their apps instead of doing constant maintenance work.


Data governance in the age of AI: Beyond the basics

Ensure governance team members have defined roles, including tactical and high-level strategy responsibilities, Smithson says. Split data champions into two groups: data stewards, who make recommendations about formulas or algorithms, for example, and director- or VP-level data owners who make the decisions, Walton adds. And put roles and responsibilities into job descriptions. “The job responsibilities come from the workflows and the tasks that need to be accomplished.” Those job descriptions should fall into two buckets, he says: data quality assurance and information consistency. For the former, tasks include identifying a data quality issue, remediating that issue with a workflow change, for example, and monitoring to ensure the effectiveness of the data governance initiative. For the latter, tasks include creating a business measure to support key performance indicators, modifying it when business rules change, and sunsetting any items that are no longer relevant. A bonus tip: Tie data owners’ bonuses to data quality. “That will get people’s attention,” Walton says.
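The identify-remediate-monitor loop for data quality described above can be sketched in a few lines. This is a hypothetical illustration, not any tool mentioned in the article; the field names and the "non-empty customer_id and email" rule are invented stand-ins for whatever rules a data steward would actually recommend.

```python
# Hypothetical data set with two records that violate the quality rule.
records = [
    {"customer_id": "C1", "email": "a@example.com"},
    {"customer_id": "C2", "email": ""},               # issue: missing email
    {"customer_id": None, "email": "c@example.com"},  # issue: missing id
]

def identify_issues(rows):
    """Steward step: flag records that violate the quality rule."""
    return [r for r in rows if not r["customer_id"] or not r["email"]]

def remediate(rows):
    """Workflow change: quarantine bad records instead of loading them."""
    bad = identify_issues(rows)
    good = [r for r in rows if r not in bad]
    return good, bad

def monitor(rows):
    """Ongoing metric a data owner can track: share of clean records."""
    return 1 - len(identify_issues(rows)) / len(rows)

good, quarantined = remediate(records)
print(len(good), len(quarantined), monitor(records))
```

The point of the sketch is the division of labor: stewards own the rule in `identify_issues`, the workflow change lives in `remediate`, and `monitor` produces the number a bonus-tied data owner would watch.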



Quote for the day:


"Leadership is the art of giving people a platform for spreading ideas that work." -- Seth Godin


Daily Tech Digest - January 25, 2018

How policymakers should approach AI
AI is already super-human in many domains and in the next 5-20 years it is quite likely that we will be able to capture and express all of extant culturally-communicated human knowledge with it. Already we are far better at predicting individuals' behaviour than individuals are happy to know, and therefore than companies are happy to publicly reveal. Individuals and parties exploiting this are very likely compromising democracy globally, notably in the UK. There is an incredibly large project here for the social sciences and the humanities as we urgently address the political, economic, and existential (in the philosophical sense) challenges of massive improvements in communication, computation, and prediction. Again, natural laws of biology tell us to anticipate an accelerated pace of change given the increased plasticity of increased intelligence. Therefore we need to ensure our societies are robust to this increase, with sufficient resilience built into the system to allow individuals to have periods out of work finding a new place in the economy.



Implement OAuth in 15 minutes with Firebase

This article provides a 15 minute, step-by-step guide to adding OAuth support to a CLI-generated Angular application using Firebase. We will implement OAuth with a Google account, but other platforms supported by Firebase include: Facebook, Twitter, and GitHub. But first, what is Firebase? Firebase got its start as a realtime cloud-hosted NoSQL database supporting multi-user synchronization. Since being acquired by Google in October of 2015, it has become an entire publishing platform for web and mobile applications. Many major companies, including Lyft, Shazam, The New York Times, and NPR, use Firebase to support their apps. Some of these applications see over 100 million monthly users and update the database more than 3,000 times per second, providing strong evidence that the platform can scale.


Why NoSQL Needs Schema-Free ETL Tools

Even developers don't like writing boring "plumbing code" — code that just links data from one place to another. It's dull and repetitive. Customers don't like it either, since anywhere that code is needed inevitably means a maintenance headache, not to mention a long time to write and test it in the first place. This means increased costs to initially deploy a new technology like NoSQL. Equally, on the output side, if you can't rapidly visualize the insights you can glean from your data, then you cannot fully realize the benefits of your investment in NoSQL database technology. Trying to code around the problem leads to longer project times, and the aforementioned increase in costs associated with custom coding. Many NoSQL companies have tried to shoe-horn SQL support into their products in an effort to bridge the gap between traditional BI vendors and their products. This has only been partially successful.
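What a schema-free ETL step looks like in practice can be sketched briefly. This is an illustrative example, not any vendor's tool: the documents and field names are invented, and the idea is simply that the pipeline discovers fields at runtime instead of hard-coding a per-collection schema, which is the "plumbing code" the article says nobody wants to keep writing.

```python
import json

# Two documents with different shapes, as a NoSQL store would hold them.
docs = [
    {"id": 1, "name": "Alice", "email": "alice@example.com"},
    {"id": 2, "name": "Bob", "address": {"city": "Leeds"}},
]

def flatten(doc, prefix=""):
    """Generic ETL step: flatten any nested document into dotted-path
    columns for a BI tool, without knowing the schema in advance."""
    out = {}
    for key, value in doc.items():
        path = f"{prefix}{key}"
        if isinstance(value, dict):
            out.update(flatten(value, path + "."))
        else:
            out[path] = value
    return out

rows = [flatten(d) for d in docs]
print(json.dumps(rows, indent=2))
```

Because `flatten` inspects each document rather than a declared schema, adding a new field to the source data requires no code change, which is the cost saving schema-free ETL tools are after.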


NHS Wales IT outage: What went wrong with its datacentres?


Guillaume Ayme, IT operations evangelist at big data analytics software supplier Splunk, raised concerns about the datacentres’ setup, given that running dual sites usually means that in the event of an outage, one will fail over to the other. “For the issue to be impacting two datacentres suggests it is severe, as one would normally be the backup for the other,” he said. “This may suggest there has been a problem in the failover procedure. “Once the service is restored, it will be essential to find the root cause to avoid a potential repeat. This can be complex for organisations that do not have full visibility into the data generated by their IT environment.” ... “While systems are now back up and running, the chaos it created shows why we need to move from hours to minutes to resolve problems like this,” said Anderson. “Ultimately, it comes down to our reliance on software and the need for it to work perfectly – and that’s difficult in IT environments that are getting more complex by the day.”


Exploring Modern Data Platforms (Podcast)

The DevOps thing is something that everyone is trying to get their head around right now. When you have a whole staff of people who know SQL and know relational databases and now we say, ‘Okay, but all of your data is going to go to an object data store.’ Like, what does that look like or how should the data be organized? How do you query it? How do you use it? That type of training, but, to be honest, that really is not as much of a leap as it was even a year ago. The evolution is happening very, very, very rapidly. A year or two ago we’d say, ‘You need to use an object data store’, and we were speaking some foreign language. Now they get it, and they say, ‘Okay, let’s do it,’ because, I think, what’s happened is that over the years people started dipping their toes in, and they’re realizing the economics of it. It’s like Hadoop was the gateway drug for this type of platform, where you could start experiencing drastic cost reduction with enhanced capabilities.


How CIOs Can Ensure a Seat at the Strategy Table


Digital disruption has placed technology at the heart of most business discussions, yet many CIOs are still fighting for a seat at the strategy table. Monika Sinha, research director at Gartner, says information and technology are considered too late in the strategy process in many enterprises. “IT is fundamental to the new business challenges,” Sinha told CIOs at Gartner Symposium/ITxpo in Goa, India, this week. “It underpins new business models, products and services that are disrupting existing industries and creating new markets. As strategists, CIOs are flexible, agile and opportunistic in their approach.” “Once your ambition is clear, appropriately position IT at the heart of your business strategy.” The traditional “wait and respond” approach to enterprise strategy – the business strategy is finalized, the CIO reviews the strategy for IT’s contribution and an IT strategy is developed in response – is no longer viable.


The Benefits Of Open Standards For Process Automation


Once we see overall total cost of ownership of these process automations systems being reduced in the long run, we’ll be able to take advantage of the built-in, intrinsic cybersecurity features that are being designed into these open process automation systems. New technologies, capabilities, and innovations will be inserted into the formerly closed systems in a much faster and cheaper way. Ultimately, that translates in manufacturing to increased equipment reliability, faster time to market, increased quality of production, and other benefits. ... It’s important to also remember that the intellectual property (IP) of those vendors is still preserved. There are points where we’re breaking up existing hardware and software systems into modules. With the modules, there will still be the intellectual property of the vendors—but the interfaces in between are what’s standard. In the future, there still will be the IP of the vendors in the hardware and software and in the application layer.


Cozy is building a personal cloud service that respects your privacy


Instead of creating yet another ecosystem of hosted services financed by ads, Cozy wants to change the balance and create a platform where the user is in charge. As you can read in the terms of services, you remain the owner of your data and your data won’t be shared with anyone unless you give your consent. And for the most privacy-concerned users, you can also install a Cozy instance on your own server. The main GitHub repositories have been updated today. The company just unveiled the first services of this new platform today. First, it starts with a good old file-syncing service. With Cozy Drive, you can install an app on all your computers, synchronize files with Cozy’s servers and find them everywhere — on your other computer, on your phone or on the web. Second, Cozy Photos lets you backup your photos. This works like Google Photos, Microsoft OneDrive’s Camera Upload and similar features.


IIoT and the Open Process Automation Forum

OPAF envisions a future open control system that will take information and data from any device and optimize it for better decision making. It will empower the workforce to be more actively involved and responsible for good business outcomes. For example, secondary measures will be key, such as differential pressure or sensor temperatures. We will be able to collect and communicate data about the overall health status of the instrument or sensor, which will drive new levels of reliability and overall operational integrity and profitability. This new level of control and new control functions will drive incredible value. Fitzgerald: Much depends on the scale and relevant policies of a given client. While DHCP might be “easier” for both wired and wireless integrations, discrete IP addresses associated with given subnets provide additional needed security and robustness of operations.


Robots are needed to manage the automation robots


Dube says the combination of physical robotic machine bodies and AI software brains will eventually make it hard to tell humans and robots apart. “We are carbon-based organisms and robots are silicon-based, but I think the boundaries around them are going to get progressively diffused to the extent that you will not be able to distinguish between a human and an android in the next nine years,” he says. “Robots are becoming fairly smooth in terms of mechanical motion. They can easily walk through a crowded mall, avoiding people. They can take an escalator, climb down stairs and even run faster than humans. In five years, their dexterity will be as good as humans. “But one component is missing – the brain – and that is the area we specialise in. When we implant the brain into the robot frame, it will be able to be asked a question, analyse what was said, and provide an answer. It will be able to walk and talk to you.”



Quote for the day:


"A leader must have the courage to act against an expert's advice." -- James Callaghan


August 24, 2016

A Portable Hard Drive Made For Mobile Streaming

Unlike its very thick predecessor, the My Passport Wireless Pro could easily be mistaken for a portable optical drive (you remember those, right?). Except that this enclosure sports a micro-USB 3.0 connector, a USB 2.0 Type A port (for charging other devices from the drive’s battery), and an SD memory-card slot (for transferring files automatically on insert, if you so choose, or at the push of a button if you don’t). The new model weighs in at nearly a pound, four ounces heavier than the original, and we're pretty sure the difference is attributable to the 6,400 mAh battery. ... Streaming was a mixed bag of easy and not so easy. That wasn’t the Wireless Pro’s fault, but rather that of the uneven implementation of streaming protocols across platforms.


Android 7.0, Nougat: The complete FAQ

The way split-screen mode works in Nougat is pretty simple, though the function is a bit hidden: While using an app, you press and hold the Overview key (the typically-square-shaped button next to Back and Home). That splits the screen in two, with your current app on top (or left) and a list of your most recently opened apps on bottom (or right). ... Updated appearances aside, notifications in Nougat are bundled by app -- so if you have, say, three new email alerts from Gmail, they'll all appear within a single card in your notification panel. ... Android's Quick Settings gets far more useful with Nougat, thanks to a new set of always-present toggles on top of the regular notification panel (illustrated above) and a newly customizable set of tiles when you swipe down from that view.


Google's Cloud Bigtable Database Handles Petabyte-Scale Workloads

Google this week also announced general availability of its Cloud Datastore managed NoSQL database service and talked up its existing and forthcoming support for applications built in Microsoft's ASP.Net environment. Cloud Bigtable is a technology that Google has used internally for several years. It powers many of Google's most heavily used services, such as Gmail, Search, Maps and Analytics. It is designed to handle very large data sets at high speeds. According to Google, that makes it well-suited for analytical and operational applications, such as financial data analysis, internet of things and user analytics. Google has previously described Bigtable as delivering more than double the performance of other NoSQL technologies, such as Cassandra and HBase, while running faster and delivering a lower total cost of ownership.


IT Investment Uptick Triggered By Productivity Goals

"In the long-run, productivity gains are mostly generated through innovations in technology and in the way that businesses manage people and technology," Ira Kalish, chief global economist for Deloitte Touche Tohmatsu, wrote in the report. "One problem is that new innovations, while always exciting, don't necessarily lead to productivity gains immediately. Rather, it can take years before innovations are absorbed into the way businesses operate, only then causing gains in productivity that lead to faster economic growth." Looking ahead, those companies surveyed noted that the greatest opportunity is in internet of things-powered technology that tracks business processes, with the ability to track customer behavior and the possibility of tracking employee productivity high on the list of capabilities businesses said they were pursuing in this area.


34 Most Disruptive Technologies of the Next Decade

For those who associate the term "hype" with failure, realize that that's not what this report is bringing into focus. Instead, it highlights "the set of technologies that is showing promise in delivering a high degree of competitive advantage over the next five to 10 years," Mike J. Walker, research director at Gartner, said in a statement. The phases of the hype cycle, as outlined in a graph created by Gartner, are as follows: Innovation Trigger, Peak of Inflated Expectations, Trough of Disillusionment, Slope of Enlightenment, and finally, Plateau of Productivity. Basically: There's a breakthrough, a flurry of press coverage touting successes, a bunch of failures that ultimately contribute to disillusionment, then people start to understand the technology more, and it goes mainstream.


Deconstructing the development mindset

A development mindset is a pattern of thinking and a way of looking at the world that invites ongoing opportunities for continuous individual and organizational transformation. It’s an abundant perspective that recognizes significance that others might overlook. Those with a development mindset appreciate that development is a state of mind, not a series of discrete activities or classes. ... Given the environmental impediments to promoting a development mindset, it would be easy to simply throw our hands up and declare defeat. But savvy leaders who understand the long-term benefits to individuals and the organization can choose to take steps to create more hospitable and supportive conditions for their employees.


Cisco well positioned to dominate cybersecurity market

The “big data” approach is the foundation of Cisco’s “Network as a Sensor” and “Network as an Enforcer” strategy. Because of its dominant share in networking, the company has more devices in more places than any other vendor. Also, it has a wealth of information available to it, including log files, NetFlow, DNS information, identity, IP address records and other network-related data that can help it quickly find anomalies and breaches. Industry-wide, the average time taken to find a breach today is 100 days. Cisco’s senior vice president and general manager of networking and security, David Goeckeler, told me Cisco could find breaches in 17 hours. I challenged him on this point and said 17 hours is still far too slow.


How to get your network and security teams working together

So, for a team focused on speed and availability, security can often be seen as a roadblock in reaching those goals -- and vice versa. "This becomes a problem when network professionals feel that security measures are red tape getting in the way of their processes, and security professionals feel that the network team's expansion and development of complex architectures are opening up the system to potential attacks," says Vigna. It's not that security isn't important to networking professionals, it's just that it isn't necessarily their focus. And the same goes for security pros. They don't want things to run slower or to create more steps for people, but it is their job to keep things as secure as possible. And as it becomes increasingly important for businesses to avoid any security breaches -- both teams will need to shift their priorities.


An iPhone feature has exposed a biometrics security flaw

The vulnerability is unlikely to present a serious threat to security, for now. Banks that employ facial recognition technology generally use it alongside other security measures — like requiring users to have a lock on their phone or only allowing a customer's account to be accessed from a single registered device. Exploiting the weakness would also require a hacker to have both the victim's phone and a Live Photo of them, which is an unlikely scenario. But this development suggests that banks should think carefully about how they use biometrics. Only 9% of UK consumers are happy to use facial recognition as a means of identification, according to Experian, and stories like this are likely to further dent consumer confidence. This implies that banks should continue to use biometrics as an additional or optional security measure, rather than a replacement for existing methods.


New report confirms you need NoSQL, and probably in the cloud

NoSQL is not an option—it has become a necessity to support next-generation applications. And increasingly, enterprises of all types and sizes are embracing NoSQL to support their business technology (BT) agenda. A key strength for NoSQL is the ability to support scale-out architecture leveraging low-cost compute servers that are clustered to deliver the performance of large, high-end SMP servers. In addition, its flexible schemaless model offers the ability to store, process and access any type of customer and business data. ... NoSQL delivers one side of the business agility equation, allowing for disparate data types at high velocity and volume. Public cloud takes care of the infrastructure side of the equation, enabling enterprises to grow or shrink resources according to data demands.
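The scale-out idea above can be illustrated with a minimal hash-partitioning sketch. This is not any particular vendor's scheme; the node names and keys are invented. It shows the core mechanism: a deterministic hash of the key decides which low-cost node holds it, so adding nodes spreads the workload. (Simple modulo placement reshuffles most keys when the node count changes, which is why production systems use consistent hashing instead.)

```python
import hashlib

# Hypothetical three-node cluster of commodity servers.
NODES = ["node-0", "node-1", "node-2"]

def node_for(key: str) -> str:
    """Deterministically map a key to one node via a stable hash."""
    digest = hashlib.md5(key.encode()).hexdigest()
    return NODES[int(digest, 16) % len(NODES)]

# Every client computes the same placement with no central coordinator.
placement = {k: node_for(k) for k in ["cust:1001", "cust:1002", "order:77"]}
print(placement)
```

Because placement is a pure function of the key, reads and writes go straight to the owning node, which is what lets a cluster of cheap machines stand in for one large SMP server.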




Quote for the day:

"A vision needs to be shared in a consumable way and integrated into business plans, each decision, each procedure and each employee's tasks." -- @RichMcCourt


August 04, 2016

What's In A Security Score?

Security scores are used by cyber insurance underwriters to evaluate a company’s potential risk, by companies to evaluate the cyber-risk posture of third-party vendors and partners, and by senior executives to explain a company’s cyber risk to its board of directors with an easy-to-understand rating. “The third-party risk management is the one we see growing the most rapidly,” says Jeffrey Wheatman, research director, security and privacy, at Gartner. “We think that at some point in the near term, a cybersecurity score will be as important as a credit score when organizations look to sign up for a partnership.”


Google DeepMind: The smart person's guide

DeepMind is a subsidiary of Google that focuses on AI. More specifically, it uses a branch of AI called machine learning, which can include approaches like deep neural networks and reinforcement learning to make predictions. This can rely on massive data sets, sometimes manual data labeling—but sometimes not. Many other AI programs like IBM's Deep Blue, which defeated Garry Kasparov in chess in 1997, have used explicit, rule-based systems that rely on programmers to write the code. However, machine learning enables computers to teach themselves and set their own rules, through which they make predictions. In March 2016, DeepMind's AlphaGo program beat world champion Lee Sedol in 4 out of 5 games of Go, a complex board game—a huge victory in AI that came much earlier than many experts believed possible.
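The contrast with rule-based systems can be made concrete with a toy reinforcement-learning loop. This is nothing like AlphaGo (which combines deep networks with tree search); it is a minimal Q-learning sketch on an invented five-state corridor, showing how an agent "sets its own rules" (the Q-table) purely from reward signals, with no hand-written game logic.

```python
import random

random.seed(0)
N_STATES = 5        # states 0..4; reaching state 4 pays reward 1.0
ACTIONS = [-1, +1]  # move left or right
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma, eps = 0.5, 0.9, 0.2  # learning rate, discount, exploration

for _ in range(500):  # episodes
    s = 0
    while s != N_STATES - 1:
        # Epsilon-greedy: mostly exploit the learned Q-values, sometimes explore.
        if random.random() < eps:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda x: Q[(s, x)])
        s2 = min(max(s + a, 0), N_STATES - 1)
        r = 1.0 if s2 == N_STATES - 1 else 0.0
        # Q-learning update: nudge Q toward reward plus discounted best future value.
        Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, x)] for x in ACTIONS) - Q[(s, a)])
        s = s2

# The learned policy: greedy action per state, derived from rewards alone.
policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(N_STATES - 1)}
print(policy)
```

No programmer ever told the agent that moving right is good; that rule emerges in the Q-table from the reward signal, which is the distinction the paragraph draws against Deep Blue's explicit rules.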


Build a Strong Security Baseline with the HIPAA Security Rule

“In addition to having updated systems, it’s also beneficial to monitor what is going on within a system,” Fisher said. “Whether it be looking for suspicious emails or suspicious activity, you then need to be able to quickly respond to or isolate that activity. Even if you can’t prevent an attack, at least if you can limit the extent of it, or the length of time in which it can occur, you can begin to mitigate those potential damages or potential harm that’s coming out of it.” If there has been a successful attack, entities need to try to lock down the system as quickly as possible to stop further spread of harm. Furthermore, as required under HIPAA regulations, a good disaster recovery plan and comprehensive data backup should also be at the top of an organization’s security priorities.


NIST wants agencies to move away from SMS authentication

“While a password coupled with SMS has a much higher level of protection relative to passwords alone, it doesn’t have the strength of device authentication mechanisms inherent in the other authenticators allowable in NIST draft SP 800-63-3,” Grassi wrote. “It’s not just the vulnerability of someone stealing your phone, it’s about the SMS that’s sent to the user being read by a malicious actor without getting her or his grubby paws on your phone.” NIST stopped short of removing the SMS guidelines entirely, due to the fact that the text messages may still work for existing government systems. However, NIST hopes the deprecation pushes agencies to re-assess their two-factor practices as they modernize their systems.
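The device-bound authenticators NIST prefers over SMS can be illustrated with a TOTP sketch (RFC 6238), the mechanism behind most authenticator apps. This is a teaching sketch, not an agency reference implementation: the shared secret lives only on the enrolled device and the server, and the six-digit code is derived locally, so there is no text message in transit for an attacker to read.

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, for_time: int, step: int = 30, digits: int = 6) -> str:
    """Time-based one-time password per RFC 6238 (HMAC-SHA1 variant)."""
    counter = struct.pack(">Q", for_time // step)          # time window index
    mac = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                                # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Shared once at enrollment, then stored only on the device and the server.
secret = b"12345678901234567890"
print(totp(secret, int(time.time())))
```

Using the RFC 6238 test secret above with time 59 yields the documented test value, so the sketch can be checked against the standard's own vectors.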


White House to Fund Tech Growth ‘Beyond Moore’s Law’

The NSCI is all in favor of partnership and collaboration. But with respect to finding a new track for sustained performance growth over the long haul, it’s looking to principles that, as of today, still sound like science fiction. “The NSCI envisions a more heterogeneous future computing environment, where digital (von Neumann-based) computing is augmented by systems implementing alternative computing paradigms to efficiently solve specific classes of problems,” reads the group’s current report. “These alternative computational paradigms — whether quantum, neuromorphic or other alternatives — may solve some classes of problems that are intractable with digital computing, and provide more efficient solutions for some other classes of problems.”


Strate, global CSDs to collaborate on blockchain use

“It sounds so simple for me to give you shares and you get cash in exchange and then the deal is done. But when you start getting into things like corporate actions, dividend payments, taxes that have to be paid, reporting things, liquidity requirements and securities lending and borrowing, you unpack this whole can of worms that needs to be dealt with,” Knowles said. She said the effective, lawful use of the technology in financial markets would require the use of a permissioned blockchain and oversight by an independent third party. “To something as high risk as the financial markets, it does need regulation, it does need standards, it does need governance and it does need some sort of overseer of the entire ecosystem,” she said.


Bitcoin exchange hack highlights security weaknesses

“Although bitcoin itself is inherently secure, a hacker can steal the keys to your wallet if you don’t store the keys securely. This isn't an inherent flaw of the bitcoin protocol, and this is what happened with Bitfinex,” he said. Al-Bassam said although there has been progress in the past few years with technology to allow secure wallets, such as hardware wallets and cold wallet software, there is still a lot more to be done. “Users who store a large amount of Bitcoin in an exchange should be aware that if they don’t have the cryptographic keys to their Bitcoin, they don’t have total control over it,” he said.


Cloud denial sliding into oblivion

The only way to completely prevent cloud usage is to shut down internet access to users. Essentially, the modern equivalent of what you only see in spy movies: a sealed network, custom-made computers with no USB port, no external hard drive, and employees are searched on their way in and out of the office. Except that "no-internet" is not really practical in the twenty-first century. Barring a sealed network, users will bypass rules and use cloud services! It can be as simple as using a file sharing system such as Dropbox to send files to colleagues. Or signing up for cloud-based analytics services in which they will upload company data to get nice reports. It can also go all the way to provisioning a mission-critical business application or a data backend for a mobile app, without having to go through IT.


IoT Will Surpass Mobile Phones As Most Connected Devices

The Ericsson report notes that many things will be connected through capillary networks, which will leverage the ubiquity, security, and management of cellular networks. The result could create a lot of opportunity for IT, as well as challenges related to security and management. Currently, about 70% of cellular IoT modules are GSM-only, with network mechanisms being implemented to foster extended network coverage for low-rate applications. The second market segment -- critical IoT connections -- is characterized by requirements for ultra-reliability and availability, with very low latency, such as traffic safety, autonomous cars, industrial applications, remote manufacturing, and healthcare, including remote surgery.


Virtual Panel: Current State of NoSQL Databases

It's clear to me that the relational databases are more mature in their integration with developer tooling than the NoSQL databases, that's just a function of time. But that is rapidly changing as the NoSQL market shakes out and the database and tooling vendors begin to consolidate around a small number of front-runners, supported by an enthusiastic OSS community. In Neo4j specifically we've been working hard over the last 5 years to produce a very productive query language called Cypher that provides humane and expedient access to the graph. That language is now in the early stages of standardization as "openCypher", and will appear as the API to other graph technology over time (e.g. there is an initiative to port Cypher to Spark). In our recent 3.0 release we worked hard to make access to the database boringly familiar.



Quote for the day:


"Chance has never yet satisfied the hope of a suffering people." -- Marcus Garvey


February 29, 2016

How do you define great IT leadership?

"Being recognised as the person that is going to drive innovation and help the company be more successful than it is today is a great way to show the important role you play," he says. While communication skills are crucial, great leaders do not necessarily have to assume the mantle of a spokesperson. ... "Your results should speak for themselves. Personal knowledge and experience can be built over time. If you bring specific industry knowledge, actively engage with peers in their language to understand their business challenges, then you can be confident that you will be recognised as a critical part of your organisation's competitive advantage."


US law will restore trust in transatlantic data flows, says EU commissioner

"[This] will pave the way for the signature of the EU-US data protection umbrella agreement. This agreement will guarantee a high level of protection of all personal data, regardless of nationality, when transferred across the Atlantic for law enforcement purposes. It will strengthen privacy, while ensuring legal certainty for transatlantic data exchanges between police and criminal justice authorities. This is crucial to keep Europeans safe through efficient and robust cooperation between the EU and the US in the fight against crime and terrorism," Jourová said. The data protection 'umbrella' agreement, a new privacy framework that will apply to personal data transferred to US law enforcement agencies, was announced by the European Commission last September, although it will not apply until EU law makers ratify it.


3 Ways to Build an Outstanding Company Culture

Engaging in constructive dialogue holds more value than simply measuring NPS scores, Cain adds. His team recognizes that unsolicited feedback offers granular insight into what truly matters to employees and customers. Also, instead of incentivizing employees based on quotas and numeric targets, Avnet offers informal rewards for behaviors that build or reinforce customer relationships. Focusing on quality over quantity empowers employees to pursue and fulfill their shared mission. "The top-notch service, support, and expertise that we provide to partners and customers would not be possible without our self-motivated and professional employees who live up to the core values of Avnet," Lim says.


How Serverless Applications Will Change Your Business

Even with serverless applications, not everything happens in the cloud, nor does all functionality come from the cloud. There's still a need for on-premises developers "who control the end-user experience," said Emison. These developers should assume the end-user part of the application is running on a powerful smartphone, tablet, or other mobile device. A substantial part of the application logic can reside there, given the growing power of the devices. In that sense, Web applications, which put all the logic on an Internet server and give the end user a browser or other form of thin user interface display, have been re-architected. In serverless applications, the user's experience is determined by the business logic on the end-user device, as well as the Internet data center server, and it represents a significantly larger share of the application than a display window.
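A minimal sketch of this split, assuming an AWS Lambda-style handler signature (the event shape, field names, and cart payload are all illustrative): the server-side share of a serverless application is often just a thin, stateless function, while the presentation and experience logic lives on the end-user device.

```python
import json

def handler(event, context=None):
    """Thin server-side function: compute an order total for a cart
    that the mobile client posts as JSON. Everything about how the
    cart is displayed and edited stays on the device."""
    cart = json.loads(event.get("body", "{}"))
    total = sum(item["price"] * item["qty"] for item in cart.get("items", []))
    return {"statusCode": 200, "body": json.dumps({"total": total})}
```

The point of the sketch is the proportion: the function above is a few lines, while the client-side logic it serves would be the bulk of the application.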


CEO training critical to cyber resilience, says APMG

“In part due to a lack of free time and in part due to a perceived view of cyber security as tangential to their core role, CEOs often overlook cyber training. Taking into account the number of cyber attacks that have become public in the past 12 months or so, any large organisation must view a breach as inevitable. “To deal with the range of threats faced by an organisation on a daily basis, its cyber security strategies must consider all possible technical or cultural factors that pose a degree of risk. With the right skills in place, an appropriate response to threats can be effectively communicated across the whole organisation in a common language,” he said.


The best media and methods for archiving your data

Active archiving has nothing to do with hard drives, per se. It's simply the act of shuttling data between media in a storage area network (SAN) with the goal of keeping the most frequently accessed data on the fastest media (RAM or SSDs) and the least frequently accessed data on slower tape or optical, with hard drives somewhere in the middle. ... Don't bother with trivial or unfinished data. Archive only irreplaceable data that's in its final state: legal or financial documents, important memorabilia, your creative efforts, etc. If you can download it again, reinstall it, or if you are still working on it, don't bother—you'll just waste time and space. Let your everyday backup take care of it. Also take the opportunity to de-duplicate and prune your data before you archive.
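The de-duplication step can be sketched in a few lines: hash every file's contents and group files whose hashes collide, so only one copy of each needs to go into the archive. This is an illustrative sketch, not a tool the article mentions; for very large files you would hash in chunks rather than reading each file whole.

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def find_duplicates(root):
    """Group files under `root` by SHA-256 digest; returns the groups
    with more than one member, i.e. candidates for pruning before
    archiving. Only content is compared, not names or timestamps."""
    by_hash = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            by_hash[digest].append(path)
    return [paths for paths in by_hash.values() if len(paths) > 1]
```

Hashing by content (rather than comparing names or sizes) catches the common case of the same photo or document saved under several names.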


How the Internet of Things is becoming the 'Internet of Commerce’

The maturation of mobile payment services combined with the proliferation of IoT-capable devices has created a perfect storm of innovation that's seeing our money going places it never could, both securely and conveniently. And thanks to innovators like MasterCard, the Internet of Things is moving from pure connectivity, to all-out functionality. Consider this: When the world was first introduced to IoT, it was "enough" to fantasize about controlling objects around you, like programming your home's thermostat from your phone. But control only scratches the surface. When MasterCard launched its Commerce for Every Device program last October, the payment innovator declared that any connected device — not just a smartphone or smartwatch — could become payments-enabled.


Data Center Security Is an Inside Game

Micro-segmentation addresses this new security challenge by distributing the security functions across all servers and machines, right at the source where applications reside (as opposed to concentrating security deep down in the physical network). Done correctly, micro-segmentation can enable 100 percent protection of data center traffic, in a simple and scalable manner. The intent is to secure data centers from inside and protect east-west traffic using fine-grained security policies. It’s worth noting that micro-segmentation isn’t limited to the east-west direction only – it is a comprehensive, 360-degree approach to protecting all data center traffic, in a modern scalable way. Is it feasible to put this new security shield around existing and new applications?
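The "fine-grained security policies" above can be pictured as per-workload allow rules enforced at every host, with a default of deny. The sketch below is purely illustrative (the tier names and ports are invented), not any vendor's policy format:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Rule:
    """One allowed east-west flow between two application tiers."""
    src_app: str
    dst_app: str
    port: int

# Default-deny: only flows listed here are permitted. In a real
# micro-segmented data center, an equivalent rule set is distributed
# to and enforced on every server, at the source of the traffic.
ALLOW = {
    Rule("web", "api", 8443),   # web tier may call the API tier
    Rule("api", "db", 5432),    # API tier may query the database
}

def permitted(src_app, dst_app, port):
    return Rule(src_app, dst_app, port) in ALLOW
```

Note what the default-deny model buys: even though the web tier is allowed to reach the API tier, it cannot reach the database directly, which is exactly the east-west lateral movement the excerpt is concerned with.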


Apache Spark vs. Apache Flink – Whiteboard Walkthrough

To give you a good analogy, imagine collecting water in a bucket and then pouring it out, versus putting in a pipe and letting water flow continuously without any intermediate delays. That's essentially the difference between a micro-batch and a continuous flow operator. Spark started as a batch processor and gradually added capabilities that make it suitable for near-real-time stream processing as well. Flink, during its initial research stages, started by solving batch problems, but along the way its researchers identified several interesting challenges in the real-time streaming paradigm. As a result, they pivoted to a continuous flow operator-based model and treat batch as a special case of real-time streaming.
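The bucket-versus-pipe analogy can be made concrete with a toy sketch (this is illustrative pseudologic, not the Spark or Flink APIs): a micro-batch loop buffers events and hands them off in chunks, while a continuous operator handles each event the moment it arrives.

```python
def micro_batch(events, batch_size, process):
    """Spark-style micro-batching: buffer events, emit full buckets.
    Worst-case latency for an event is one whole batch interval."""
    batch = []
    for event in events:
        batch.append(event)
        if len(batch) == batch_size:
            process(batch)
            batch = []
    if batch:                 # flush the final partial batch
        process(batch)

def continuous(events, process):
    """Flink-style continuous flow: each event is processed on arrival,
    so latency is a single event, not a batch."""
    for event in events:
        process([event])
```

The two functions produce the same totals over the same stream; the difference is purely in when each event becomes visible downstream, which is the latency distinction the analogy is after.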


Unified Data Modeling for Relational and NoSQL Databases

Current relational databases generally follow third normal form (3NF). With the ACID transaction model (Atomic, Consistent, Isolated, Durable), relational databases work well when each data set has only one copy in the database, meaning one copy is modified at a time. However, data needs aggregation when it's queried from multiple different applications. So data needs to be distributed, and the schema needs to be de-normalized according to the business requirements. Schemas should be designed to enable distributed queries: each data set must contain enough information to run its queries independently on different data nodes. Based on the above, using a logical model to describe the business requirements and de-normalizing it into a physical data model is fundamental when building NoSQL databases.
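The de-normalization step described above can be sketched as follows (all table and field names are invented for illustration): a normalized relational shape keeps customers and orders in separate tables joined by a key, while the de-normalized document embeds the customer inside each order so a single NoSQL node can answer the query without a cross-node join.

```python
# Normalized shape: two "tables" related by customer_id, as 3NF prescribes.
customers = {1: {"name": "Ada", "city": "London"}}
orders = [{"order_id": 10, "customer_id": 1, "total": 99.0}]

def denormalize(order):
    """Fold the join into the record: embed the customer row in the
    order so the resulting document is self-contained and can be
    queried on whichever data node holds it."""
    doc = dict(order)
    doc["customer"] = customers[order["customer_id"]]
    return doc
```

The trade-off is the one the excerpt names: the customer's data now exists in more than one copy, so updates must touch every embedded copy, which is why this design only fits once the query requirements are known.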



Quote for the day:


“The path of cultivating excellence is practice. And not just any practice...” -- Bob Dunham