Daily Tech Digest - July 18, 2021

How to answer the age-old question: Could this meeting have been an email?

Companies want their people to be productive and their processes and systems to be efficient. “From an efficiency perspective, meetings to discuss a challenging issue or to make a decision are a good investment of time,” says Christie. “It is a better use of the team’s time to get them in a room together for 30 minutes to debate an issue or make a decision, than it is to send multiple emails seeking input, to read all of the input, to synthesize the input, and then arrive at a decision which then needs to be disseminated.” ... Before you send out a calendar invite, consider that meetings are multiples of their attendees, says Burns. “If a bad six-minute meeting has 10 attendees, that’s an hour of wasted productivity,” he says. “Now imagine a bad 30-minute meeting with three people or more.” Meetings are requests for someone’s time, so carefully evaluate its cost and benefit, says Janardan. ...  Resist the urge to default to a meeting, says Janardan: “Remind yourself of all the things that got resolved when you had those casual run-ins at work and try to recreate them—whether you are in-person or not. Sometimes the fastest way to come to a solution is to just pick up the phone or physically seek out a colleague if you are spending more time in a physical office.”
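
Burns's arithmetic is simple multiplication: a meeting consumes every attendee's time simultaneously. As a toy illustration (the function name and units are my own, not from the article):

```python
def meeting_cost_minutes(duration_minutes, attendees):
    """Person-minutes consumed by a meeting: every attendee spends
    the full duration, so the cost scales with headcount."""
    return duration_minutes * attendees

# Burns's example: a bad six-minute meeting with 10 attendees
print(meeting_cost_minutes(6, 10))  # 60 person-minutes, an hour of lost productivity
```

Framed this way, shaving even a few minutes off a large recurring meeting compounds quickly.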


Researchers Create New Approach to Detect Brand Impersonation

These attacks, in which adversaries craft content to mimic known brands and trick victims into sharing information, have grown harder to detect as technology and techniques improve, says Justin Grana, applied researcher at Microsoft. While business-related applications are most often spoofed in these types of attacks, criminals can forge brand logos for any organization. "Brand impersonation has increased in its fidelity, in the sense that, at least from a visual [perspective], something that is malicious brand impersonation can look identical to the actual, legitimate content," Grana explains. "There's no more copy-and-paste, or jagged logos." In today's attacks, visual components of brand impersonation almost exactly mimic true content. ... While too many types of content can present a detection challenge, too few can do the same. Many brands, such as regional banks and other small organizations, aren't often seen in brand impersonation, so there might only be a handful of training examples for a system to learn from. 


The 10 temptations you should not fall into as a leader

The path of leadership is plagued with complex situations that call for unpleasant decisions: a layoff, sanctions of various kinds, abandoning a tactic, a long-standing customer or an established supplier. Here, a frequent error occurs when, out of a lack of courage or a fear of losing the admiration of collaborators, a leader avoids taking this type of action, naively assuming that “everything will pass.” On the contrary, each and every one of these unresolved situations tends to grow in emotional weight as the days go by, steadily acquiring an increasingly bitter taste. ... Leaders who fall into this temptation have a clear streak of insecurity, feeling that they could be in danger if they do not know everything that happens. People who adopt mistrust as their standard logic will then install a trusted crony whose main responsibility is to report everything that colleagues say, do or even think. The effects of this are lethal: the leader's credibility is undermined and collective mistrust is reinforced, seriously eroding the transparency that a team culture requires as the oxygen for its operation.


Beginner’s Guide To Machine Learning With Apache Spark

Spark is known as a fast, easy-to-use, general-purpose engine for big data processing. Like Hadoop MapReduce, it is a distributed computing engine used to process and analyse large amounts of data, and it is considerably faster than other processing engines when handling data from various platforms. In the industry, there is strong demand for engines that can process such workloads. Sooner or later, your company or client will ask you to develop sophisticated models to discover a new opportunity or the risk associated with it, and all of this can be done with PySpark. If you know Python and SQL, it is easy to get started. PySpark is a data analysis tool created by the Apache Spark community for using Python with Spark. It allows you to work with Resilient Distributed Datasets (RDDs) and DataFrames in Python. PySpark has numerous features that make it approachable, including MLlib, a powerful framework for machine learning. When it comes to huge amounts of data, PySpark provides fast, real-time processing, flexibility, in-memory computation and various other features.


Facebook AI Releases ‘BlenderBot 2.0’: An Open Source Chatbot

BlenderBot 2.0 conducts longer, more knowledgeable, and more factually consistent conversations over multiple sessions than the previous state-of-the-art chatbot. BlenderBot’s improved conversational abilities have made it a serious contender in artificial intelligence research. The AI model takes the information it gets from conversations and stores it in long-term memory. The knowledge is stored separately for each person it speaks to, which ensures that new information learned in one conversation can’t be used in another. The model can read and respond in real time, and it can scan the internet for new information to keep a conversation up to date, making it an excellent tool for keeping up with current events. Facebook AI Research is releasing the complete model, code, and evaluation setup to help advance conversational AI research. For training purposes, the Facebook team combined human conversations with internet searches.


The Evolving Role of the CISO

In the era of the digital workplace, CISOs must not only focus on preventing threats, but create systems that work for the business and still keep everyone protected. Constant innovation, creation and implementation of unique strategies are already part of the CISO's job description. ... Decision-making that ties business strategy and security processes into a firm knot is the only way to stand straight amidst the fast-paced, ever-changing storm of digital services. The role of the CISO is evolving faster than ever, and the CISO is becoming a jack of all security and business trades. On Monday, they’re the superheroes keeping the cybercriminals out. On Tuesday, they’re improving the organization’s security posture. By the end of the week they’re C-suite ambassadors and innovating the concept of security, all while delivering positive business value. As the role continues to evolve and the CISO’s depth and breadth of knowledge regarding the business, its underlying technology and its core risks grows, the role will continue to elevate outside of IT and be seen as a peer of the CIO.


Is DeFi the future of financial infrastructure and money?

DeFi apps could benefit by borrowing some legacy concepts, particularly in terms of compliance and consumer experience. For example, they could definitely make the front end of these apps a much better experience for end users. The DeFi space also doesn’t really have a concept of customer relationship management, nor do these apps typically collect any consumer data. While great from a privacy perspective, there is great value in understanding the customer better. DeFi products do undergo security audits, but they offer none of the security guarantees most consumers are accustomed to in the traditional financial world. Notifications and alerts also don’t really exist at all in the DeFi space. In terms of products, there are tools to measure blockchain activity, but not to measure engagement within DeFi applications. Most developers in the crypto space are building right on top of the layer one protocol itself; there are no concepts of developer platforms or middleware yet. In traditional finance, if you make a mistake, a financial institution can initiate a rollback of the transaction – this doesn’t exist at all in DeFi yet.


How Blockchain and Cryptocurrency Can Revolutionize Businesses

Unlike traditional card payments, which can be reversed using the chargeback feature, Bitcoin and other cryptocurrency payments cannot be reversed. Because each transaction is securely recorded, there is a long-term audit trail that can be utilized to trace transactions and verify their authenticity. As a result, each transaction has greater auditability and accountability, dramatically reducing the likelihood of fraudulent transactions. This auditability feature can also be used to track other assets, allowing businesses to keep a database of various types of information about these assets up to date.

Increased traceability of the supply chain

The use of blockchain-based applications makes it easier to track products and goods as they move through different stages of the supply chain. The ability to monitor suppliers in real time, eliminate human errors in data updating and use smart contracts for payments is expected to transform the global supply chain industry. With the supply chain becoming more efficient, organizations can shift their focus to cutting down other costs and streamlining other processes, including production.


The lighthouse signals a digital disruption storm

In BFSI, with remote working/collaboration and digital transactions, cybersecurity has become one of the main focus points during the COVID-19 pandemic, adds Rajdeep Saha, Managing Director, Financial Services – Technology Consulting Practice, Protiviti India. “In this context, secure access service edge (SASE) and Zero Trust model can help create a single cloud-native security service, coupled with other enablers.” Tare recommends a host of measures underlining the new role that artificial intelligence (AI) and analytics would be playing ahead. “Cyberattack insurance is now available from several organisations. A bundle like this is a must for all financial institutions. Security analytics, machine learning, and artificial intelligence are some of the cutting-edge technologies that are helping strengthen the cyber defense mechanism. Before threats assault your infrastructure, the finest protection mechanisms detect and neutralise risks.” Implementation of PCI-DSS compliance, card payment security, and others have reduced the impact of cyber threats in financial institutions, he adds.


Is Cryptocurrency-Mining Malware Due for a Comeback?

Cryptocurrency mining refers to solving computationally intensive mathematical tasks. In the case of bitcoin, such tasks are used to verify the blockchain, or public ledger, of transactions. As an incentive, anyone who mines for cryptocurrency has a chance of getting some cryptocurrency back as a reward. But for bitcoin and some other types of cryptocurrency, the amount of reward decreases as more blocks get added. Mining can consume copious amounts of electricity - so much so, that some studies have found it would be cheaper to buy gold outright rather than obtain cryptocurrency via mining. Such calculations are always in flux, with the rise and fall in cryptocurrency value. But for attackers, the easiest approach is to have someone else pay for the power while they walk away with the cryptocurrency. ... The takeaway for security teams, as ever, is vigilance, because if attackers can sneak cryptominers onto an organization's systems - eating up processing power and racking up sky-high electricity bills - they might put something nastier there too.  
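
To illustrate why mining consumes so much compute, here is a toy proof-of-work loop in Python. It is a deliberate simplification (Bitcoin hashes structured block headers with double SHA-256 against a numeric difficulty target, not a hex-prefix check), and the function and its parameters are invented for this sketch, but the brute-force search is the same idea:

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Toy proof-of-work: find a nonce such that SHA-256(block_data + nonce)
    starts with `difficulty` hex zeros. Expected work grows by a factor
    of 16 for each extra zero required."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# Three hex zeros already takes ~4,096 hash attempts on average;
# real networks demand vastly more, hence the electricity bills.
print(mine("example-block", 3))
```

Finding the nonce is expensive, but anyone can verify it with a single hash, which is exactly the asymmetry cryptojackers exploit when someone else pays for the CPU cycles.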



Quote for the day:

"The level of morale is a good barometer of how each of your people is experiencing your leadership." -- Danny Cox

Daily Tech Digest - July 17, 2021

Preparing The Office For The AI Revolution

Continuous learning should already be a key part of your culture. Training and developing awareness of AI capabilities and benefits is simply another string to the bow. It helps your business stay competitive, but it also feeds into employee engagement, retention and attracting new talent. It may be that you organize on-site workshops or, for those who want it, subsidize an advanced degree program. Every employee is different, so it’ll take some mixing and matching to tailor your efforts to their preferred learning environments, but harnessing this learning and applying these new skills within your business not only nurtures talent, but it also fosters forward-thinking, equipping your people with the skills and the mentality — resilience even — to evolve with the ever-changing world of work. Your people are your most important asset — we all know this. Finding the right people for your business can be the difference between success and failure. However, finding the right combination of people and technology can give you an even greater competitive advantage, one that you never even considered.


5 years until enterprise quantum, but your prep begins now

Taking a longer view of the enterprise quantum computing journey will yield greater benefit and business value. That means defining the link between your organization’s quantum program and broader business strategies. Ideally, set up a feedback loop where business strategies help to prioritize quantum program outcomes and the quantum program informs new strategies. ... Navigating the rapidly evolving quantum ecosystem is difficult, with an increasing variety of vendors, academic institutions, startups, and cloud-based service providers. Selecting the right company requires identifying which ones can understand your specific business use case(s) and provide the type of quantum technology you need for your specific strategy. One challenge can be around making long-term commitments to quantum partners. Hardware and software providers have until recently been focused on finding a few high-value strategic collaborators, as reflected in their access contracts or licensing terms, lengthy subscription models, or research programs. 


Ransomware attacks show we're getting clobbered on cybersecurity

The Colonial Pipeline and JBS attacks disrupt the traditional construct that makes IT attacks the playground of criminals and OT attacks the purview of hostile intelligence services. Criminals now have a taste of how lucrative an OT attack might be. Will we start to see criminals target OT infrastructure more often? That is certainly the concern of law enforcement and the intelligence community. Attacks on infrastructure by criminals who aren’t deterred by diplomatic niceties change the game. This concern was reflected in announcements on Friday by the Department of Justice that ransomware extortions will immediately move up the priority stack to equal terrorism. FBI Director Christopher Wray followed up by likening the surge in ransomware hacks to 9/11. These are startling statements that wouldn’t have been issued without the Colonial Pipeline attack. Interesting what some gas station waiting lines will trigger. Lurking behind all this is a scenario that has our protectors reaching for antacids. Are enemy intelligence services making moves to leverage criminal organizations to attack OT infrastructure, with all its debilitating effects, in a kind of proxy cyber war?


Linux Variant of HelloKitty Ransomware Targets VMware ESXi Servers

The days when Linux, Unix and other Unix-like computer operating systems weren’t typically targeted by malware authors are long gone. It might well have been the case that attackers used to prefer bedeviling Windows systems, given that Windows instances are far more widespread than Linux instances. As well, Linux instances are generally well-protected against vulnerabilities, thanks to a tight-knit user base that delivers fast security updates. Andrew Barratt, managing principal of solutions and investigations at cybersecurity advisory firm Coalfire, told Threatpost on Friday that we said goodbye to the days when malware didn’t target Linux “a long time ago,” but that change was typically server-side and hence not particularly visible to the public. “With the rise of Mac OS on the desktop and its underlying infrastructure being based on BSD – everyone’s favourite ‘hard nix’ – there has been a correlation in *nix based malware as attackers target the Apple end user,” Barratt said in an email, *nix being shorthand for any Unix, Linux or other Unix-like systems.


Assessing the Feasibility of Research and Data Science Projects

Estimating the probability of technical success — designated as P(T) from now on — is an exercise in identifying the salient risk factors, then scoring each individually on a consistent scale before plugging the values into a mathematical function. Easy. But what exactly is “technical success”? The definition of P(T) is as follows: a likelihood that, given existing data, available technologies, required research, and development, the prototype will be completed and will perform the defined function in a curated laboratory or constrained production environment. P(T) is given as a percentage. The definition is important as it sets expectations for the deliverables. We limit the scope of P(T) to the prototype designed to test the technologies and convince a moderate sceptic that the approach works. The tests should run in a limited but representative case and use a vetted data set. Why the limitations? They decrease the time needed to develop the prototype without compromising the evaluation. They reduce the chance of misalignment with final goals by facilitating minor, continual adjustments.
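
The article does not spell out the mathematical function that combines the factor scores. A minimal sketch, assuming each salient risk factor is scored as a success likelihood on a consistent 0-to-1 scale and that the factors are treated as independent, would simply multiply them (the function name and the example factors below are hypothetical):

```python
def probability_of_technical_success(factor_scores):
    """Combine per-factor success likelihoods (each in [0, 1]) into an
    overall P(T). Treating the factors as independent, the simplest
    combining function is their product: one weak factor drags the
    whole estimate down, which matches the intuition that any single
    risk can sink a prototype."""
    p = 1.0
    for score in factor_scores:
        if not 0.0 <= score <= 1.0:
            raise ValueError("scores must be on a 0-1 scale")
        p *= score
    return p

# e.g. data availability 0.9, technology maturity 0.8, research risk 0.7
print(probability_of_technical_success([0.9, 0.8, 0.7]))  # ≈ 0.504
```

Even three individually healthy-looking factors yield a coin-flip P(T), which is why scoring them explicitly beats gut feel.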


Top 10 Low-Code App Development Best Practices to Follow

It’s a better idea to put an imperfect application into production than to wait until it is fully baked. Succeeding with low-code means dividing your application into meaningful chunks and publishing them as often as you can. The idea is to constantly get immediate field feedback from users and make consistent small improvements. Conduct regular feedback sessions with your team members and see how the app operates and what it lacks. ... Low-code platforms provide ready-made UI libraries with consistent components. They’re easy to use, and it’s a good practice to create a simple platform that borrows the best features from market leaders. It will take less time and money than creating a unique UI/UX and will allow you to release your app faster. In our experience, UX specialists bring value at the beginning of the project, but later their role decreases significantly, and on the off chance that UX and Visual Design support is needed later on, experts can contribute on an as-needed basis. ... To continue developing a good product, you should always communicate with the low-code community and your users.


The ICE Blockchain

The purpose of this new blockchain is the much-needed addition of EVM compatibility to our ecosystem. We believe EVM compatibility is paramount for the growth of the ICON Network (see Polygon’s recent success as a good indicator of its importance). It enables developers to iterate and innovate much faster using existing tooling and code from the Ethereum ecosystem and allows for increased growth prospects of ICON given easier portability of applications. Additionally, with this new chain, we are also adding eWASM support that puts us ahead of the game, preparing us for ETH 2.0 compatibility. ... The ICE blockchain will start as a Proof of Authority network, with trusted validators operating the chain. After the network is stabilized, it will migrate to NPoS, the consensus mechanism commonly used by other Substrate SDK networks. NPoS is a variation of DPoS (ICON Network’s current consensus mechanism), where token holders vote for validators to produce blocks and govern the network. Smart contracts on the ICE Network will be written in Solidity, Rust, and C++. Fees for smart contract interactions, smart contract deployments, and token transfers will be paid in ICE tokens.


Physical threats increase as employees return to the office

“As corporations advance the digital transformation of their physical security operations, it has never been more important to seize the opportunity to align their cybersecurity operations and infrastructure,” said Lukas Quanstrom, CEO of Ontic. “Bringing together all threat data and intelligence in an always-on, technology-driven approach to security is the most effective way to advance business continuity in today’s increasingly hyper-connected, hyper-violent environment.” ... Of the physical threats that resulted in harm or death at companies in 2021, 49% of respondents think most or almost all could have been avoided if cybersecurity and physical security intelligence were unified so threats could be shared and actioned by cross-functional teams. 55% of physical security and IT leaders say their CEO believes training employees so they are better prepared for potential workplace violence will create a culture of fear. 26% of those surveyed say their company has never addressed the potential for workplace violence and employees would not know what to do if an active shooter entered their facilities.

First, repeated coding vulnerabilities caught by the security team create frustration for both groups, causing dissension that can impede innovation. Further, the cycle of coding, testing, fixing and then retesting for vulnerabilities delays application releases and extends time to value. And when vulnerabilities lead to successful cyberattacks, everyone is frustrated. The legal and compliance teams face regulatory repercussions. Customers and partners lose trust in the organization. And the C-suite and board see the full scope of the ramifications: time and money wasted, privacy violations and fines, loss of intellectual property, customer and partner attrition, and lawsuits and brand damage. As a result, ensuring secure code from the very beginning of the software development life cycle cannot be ignored as an essential part of a comprehensive cybersecurity defense strategy. So how can you close this security gap, eliminate application vulnerabilities at the source and make everyone happy? First, recognize there are no bad guys on your defense team. This is a problem of process — that is, ensuring the right people have the right skills and access to the right information.


Decentralized Finance: The Next Big Threat for the Finance Sector

Decentralized finance, or DeFi, is a fast-growing segment of the financial markets. Based on a blockchain platform, DeFi provides software services that can cut out intermediaries in financial transactions, thereby allowing for financial services, such as mortgages and investment, to be delivered at lower costs. The question is: Will it take off, or will the financial sector push back? ... First of all, there are a host of technical risks and concerns about attacks and hacks that have been very significant in DeFi. There have been hundreds of millions of dollars lost because these systems are not sufficiently mature, robust and resilient. ... There are legal risks as well, where regulators appropriately have concerns about things like money laundering and fraud that are going on in the larger blockchain and cryptocurrency world and as well as in DeFi specifically. The value of these DeFi services is that they are decentralized, so there’s not one actor that is responsible for all of the transactional activity.



Quote for the day:

"It's not about how smart you are--it's about capturing minds." -- Richie Norton

Daily Tech Digest - July 16, 2021

Apps Built Better: Why DevSecOps is Your Security Team’s Silver Bullet

Security should never be an afterthought when developing software and applications. However, as technological advances continue to take hold, the security tools many rely on are changing in real-time, and combatting potential breaches or hacks of your wares before they arise now requires new strategies. To stay one step ahead of bad actors, it’s time to reimagine the application-development process from “DevOps” to “DevSecOps.” DevOps is an evolution of the agile development lifecycle for software that bridges the gap between development and operations teams. It breaks down silos and increases an organization’s ability to deliver applications and services faster than traditional software-development models. Previously, the traditional “waterfall method” required long cycles upfront and resulted in strenuous and tedious processes – and by the time these solutions were deemed ready for release, the market had likely changed. Today, agile software teams have release cycles that are a matter of days or hours, which increases the risk of making mistakes and introducing vulnerabilities.


Organizing Complexity

Suppose complex systems can be described as the sum of simpler components. In that case, the complexity of the simpler components affects the parent system exponentially. The parent system will be much more unpredictable if each individual part becomes less predictable. Remember that experts on a complex system can completely understand the simpler components which combine to create the larger system. This means that each individual piece, large or small, must reach a point where it can be thoroughly understood. In addition to managing the complexity of simpler components, these components must also be easily accessible by the system as a whole. Let's go back to our authentication system. Here, we have a perfectly isolated part that predictably handles a specific authentication format and flawlessly retrieves identity information from a single database. Our authentication system is of little use if it can't pass the data to the appropriate authorization system. Likewise, suppose it can give the data, but the authorization system cannot receive the information. In that case, it is also of little use. 
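
As a hypothetical sketch of this point, here are two deliberately simple components, an authenticator and an authorizer, joined by an explicit hand-off. Each is trivially understandable in isolation, yet the system only works if the identity one produces is in a shape the other can consume (all names and the data format here are invented for illustration):

```python
class Authenticator:
    """Isolated component: verifies credentials against a single user
    database and retrieves identity information."""
    def __init__(self, user_db):
        self.user_db = user_db

    def authenticate(self, username, password):
        record = self.user_db.get(username)
        if record and record["password"] == password:
            # The returned dict is the interface contract the rest of
            # the system depends on.
            return {"user": username, "roles": record["roles"]}
        return None

class Authorizer:
    """Separate component: consumes the identity the Authenticator
    produces. If the two disagree on the identity format, both are
    individually correct but the parent system is useless."""
    def can_access(self, identity, required_role):
        return identity is not None and required_role in identity["roles"]

db = {"alice": {"password": "s3cret", "roles": ["admin"]}}
identity = Authenticator(db).authenticate("alice", "s3cret")
print(Authorizer().can_access(identity, "admin"))  # True
```

The complexity lives in neither class; it lives in the hand-off, which is why accessible, agreed-upon interfaces matter as much as component simplicity.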


You have a remote workforce – what now?

Unsurprisingly, enterprises may start to look at direct connectivity to the applications themselves. Establishing direct and private connections to your cloud provider can help alleviate some of the new demands on a corporate network caused by remote working. When struggling with maintenance challenges, one approach could be adopting a network model that is flexible and scalable. This is particularly beneficial in the face of unpredictability and potential disruption with connections and bandwidth that can be set up, flexed up, and torn down in near real-time. That’s again where a service that securely connects data centres, clouds, applications and business partners can play a role. By using technologies that interconnect networks and clouds in real-time, businesses can self-provision redundant network links – for a day, a week or even a month – that can be used to carry traffic during maintenance. Furthermore, it is critical to have transparency and visibility over the whole network, allowing IT managers to ensure that business applications are all up-to-date and that employees are running the most secure version. As more and more workers pick up these tools a harsher spotlight has been cast on their suitability.


How to Attract More Computer Science Grads to the Cybersecurity Field

Consider the freshly minted computer science degree holder who comes across a job posting for a cybersecurity "solutions architect" or "presales engineer." Their first reaction might be, "That's a sales position; it doesn't match the skills I've spent the last four (or more) years of my life acquiring." According to the ESG/ISA report, CISOs are doing little to debunk that misperception by only looking for candidates with narrow technical skill sets at the expense of other necessary qualifications. "This may reveal that few CISOs have the blend of business, leadership, communications, and technical skills necessary for success," wrote the report's authors. "CISOs are business, not technical, leaders." Consider the role of a solutions architect. It requires the right mix of social and interpersonal skills and technical strengths. I must be able to dive into the technical weeds with some people and also speak in non-technical terms to employees and their managers. ... So what's the path forward? For college students and midcareer professionals, explore opportunities fully rather than simply reading the title of a job description. 


What is Network as a Service (NaaS)?

It's clear that enterprise customers' move to cloud services is the primary driver for NaaS. And while the networking industry is only now sorting out how the cloud world will be effectively networked, NaaS could play a big role in the future. Networking must be virtualized and automated to enable operations at cloud speed, Pluribus's Gill said. "The big question is what services and business models will prevail. With most applications staying in private cloud environments, private cloud NaaS may be the most important." Another key market opportunity will come from NaaS offerings that simplify use of multiple public clouds, so users do not have to be experts in each cloud's native networking stack to operate a multi-cloud environment, Gill said. "With more and more applications moving to Cloud/SaaS, traffic profiles are shifting dramatically," Anderson said. "We used to build campus networks with tremendous aggregation back to a core network – then to our private data center where our app workloads ran. In the future, if most traffic is headed to Cloud/SaaS, are core networks needed anymore?"


Reducing the Computational Cost of Deep Reinforcement Learning Research

It is widely accepted that the enormous growth of deep reinforcement learning research, which combines traditional reinforcement learning with deep neural networks, began with the publication of the seminal DQN algorithm. This paper demonstrated the potential of this combination, showing that it could produce agents that could play a number of Atari 2600 games very effectively. Since then, there have been several approaches that have built on and improved the original DQN. The popular Rainbow algorithm combined a number of these recent advances to achieve state-of-the-art performance on the ALE benchmark. This advance, however, came at a very high computational cost, which has the unfortunate side effect of widening the gap between those with ample access to computational resources and those without. In “Revisiting Rainbow: Promoting more Insightful and Inclusive Deep Reinforcement Learning Research”, to be presented at ICML 2021, we revisit this algorithm on a set of small- and medium-sized tasks. 


IBM shows quantum computers can solve these problems that classical computers find hard

Big Blue's scientists developed a new classification task and found that a quantum algorithm using the quantum kernel method is capable of finding relevant features in the data for accurate labeling, while for classical computers the dataset looked like random noise. "The quantum kernel estimation routine we use is a general method that can be in principle applied to a wide range of problems," Kristan Temme, researcher at IBM Quantum, tells ZDNet. "In our paper, we formally prove that this quantum kernel estimation routine can give rise to learning algorithms that for specific problems outperform any classical learner." To prove the advantage that the quantum method has over the classical approach, the researchers created a classification problem for which the data can be generated on a classical computer, and showed that no classical algorithm can do better than random guessing when attempting to solve the problem. When viewing the data in a quantum feature map, however, the quantum algorithm was able to predict the labels with high accuracy and at speed.
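
IBM's quantum kernel estimation runs on quantum hardware, but the machinery around it is the familiar classical kernel method: a similarity function between data points feeds a standard classifier. As a rough classical analogue (this is not the quantum routine; the RBF kernel and the weighted vote below are stand-ins to show where a kernel plugs in):

```python
import math

def rbf_kernel(x, y, gamma=1.0):
    """Classical similarity measure between two points. The quantum
    kernel method replaces exactly this function with a similarity
    estimated via a quantum feature map, leaving the rest of the
    classifier unchanged."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def kernel_predict(train, x, gamma=1.0):
    """Label a point by a kernel-weighted vote over labeled examples
    (labels are +1 / -1)."""
    score = sum(label * rbf_kernel(x, point, gamma) for point, label in train)
    return 1 if score >= 0 else -1

train = [((0.0, 0.0), -1), ((0.1, 0.2), -1), ((2.0, 2.0), 1), ((2.1, 1.9), 1)]
print(kernel_predict(train, (1.9, 2.0)))  # 1
print(kernel_predict(train, (0.2, 0.1)))  # -1
```

IBM's result is that for a specially constructed problem, no classical kernel like the one above does better than random guessing, while the quantum-estimated kernel separates the classes accurately.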


The rise of the Developer Experience Engineer, and why it matters

Engineering teams need a leader, a Developer Experience Engineer, who ensures developers have the right tools, processes, and environment to maximize productivity and create the greatest business value possible. The DXE is the foundation for engineering team success. They make it easy for developer teams to focus on their highest purpose and generate the highest value by solving, automating, and eliminating the daily toil developers encounter. They are a major unlocking force that boosts teams to new heights. A DX owner or function isn’t a new idea. Twitter formed an “engineering effectiveness” organization in 2014, and Google has a massive “engineering productivity” team. In many companies, however, the role emerges organically and is growing. DXEs implement a common set of principles, maintain the right tools, and create cohesive standards that clear the path to developer success. Without them, too many languages, frameworks, engineering styles, and processes can add drag. A drag on developers is a drag on the business.


Keeping a cohesive and engaged IT team during a pandemic

Like most things in the world of IT, ultimately it boils down to communication. No matter the awesome technology and tools we have, every problem comes down to being a people problem. I don’t mean more meetings - goodness no - but during the meetings we already have, make sure to find some time for small talk about hobbies and interests, and absolutely ensure successes and victories and personal milestones are made known and celebrated among the team. The right tools also must come into play; Zoom, GoToMeeting, Google Meet, AWS Chime, and the rest are brilliant tools, but video meeting after video meeting stifles productivity and drains people. Instead, consider the other array of tools at your disposal - casual chat applications, project management software, knowledge bases and documentation, task tracking systems, and the like. Harness these to keep your team connected. Consider also the creative options that build up your virtual culture with good-hearted fun that bonds people together. Some companies have brought in a weekly Netflix watch party, others a TikTok-style challenge, and some play Jackbox games.


Why Data Science Might Not Be The Right Career For You?

Most people go into data science for the adventure it offers. However, the reality is slightly different. “In most organisations, you’ll have to spread your time between doing technical work and the other, less exciting stuff,” said Adam Sroka, head of machine learning engineering at Origami. So, if you are not keen on reporting, writing, documenting, delivering presentations, repeatedly explaining the basics of your models and techniques to stakeholders, project management, administrative overhead, and the like, then the job might not be the right fit for you. Candidates coming from an education or research background often fall into the trap of an infinite-timescale and infinite-budget mindset. “All too often, I have heard protests from data scientists saying they cannot put a ‘timeline’ on when their work will be finished, and it will take as long as it takes. This simply is not true and won’t fit well with the culture at most organisations,” said Sroka. ... Communication is pivotal to forging a successful career in data science. For instance, if you are working closely with the company’s decision-makers, maintaining a solid relationship is essential.



Quote for the day:

"It is, after all, the responsibility of the expert to operate the familiar and that of the leader to transcend it." -- Henry A. Kissinger

Daily Tech Digest - July 15, 2021

Why Developers Should Learn Kubernetes

Along with DevOps and SRE adoption, there is also a lot of discussion about “shifting left” in the software development world. At its core, shifting left means focusing on moving problem detection and prevention earlier in the software development lifecycle (SDLC) to improve overall quality. More robust, automated continuous integration/continuous delivery (CI/CD) pipelines and testing practices are prime examples of how this works. Shifting left applies to operational best practices as well. Once upon a time, developers would code their applications and then hand them off to operations to deploy into production. Things have changed dramatically since that time, and old models don’t work the way they once did. Knowing about the platform that the application lives on is critical. Successful engineering organizations work hard to ensure development and operations teams avoid working in silos. Instead, they aim to collaborate earlier in the software development lifecycle so that coding, building, testing and deployments are all well understood by all teams involved in the process.


Top 5 Programming Languages for Automation Testing

JavaScript focuses strongly on test automation and performs well when it comes to meeting client-side expectations through front-end development. Inevitably, many well-known web applications and organisations, such as Instagram, Accenture, Slack, and Airbnb, are supported by JavaScript automation libraries, including instauto, ATOM (Accenture Test Automation Open source Modular Libraries), Botkit, and Mavericks. Besides, there are various testing frameworks like Jest, Jasmine, and Nightwatch JS which refine multiple processes of unit testing as well as end-to-end testing. These tools let programmers and developers build strong web applications while focusing on the core logic of the business and quickly resolving security-related issues that may occur anywhere and anytime. With such advantages, automation testing teams won’t feel pressured: debugging time and code glitches are reduced, and productivity is promisingly increased with the shift-left testing approach.


Trickbot Malware Rebounds with Virtual-Desktop Espionage Module

The latest version of the spy module makes use of virtual network computing (VNC): hence its name, vncDll. It essentially sets up a virtual desktop that mirrors the desktop of a victim machine and sets about using it to steal information. It’s been circulating since late May, researchers said. When first installed, vncDll uses a custom communications protocol to transmit information to and from one of the up to nine C2 servers that are defined in its configuration file. The module will use the first one to which it can connect. “The port used to communicate with the servers is 443, to avoid arousing the suspicion of anyone observing the traffic,” according to the Bitdefender analysis. “Although traffic on this port normally uses SSL or TLS, the data is sent unencrypted.” The first order of business is to announce to the C2 server that it’s been installed, and it then waits to receive a set of commands. The C2 connects to an attacker-controlled client, which is a software application that the attackers use to interact with the victims through the C2 servers. 
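Bitdefender's observation - port 443 traffic without the TLS encryption that port normally carries - suggests a simple detection heuristic. The sketch below is purely illustrative (not taken from any vendor's product): real TLS flows open with a handshake record (content type 0x16, version major byte 0x03), so flag 443 traffic whose first bytes don't match. Production detection would of course inspect far more than three bytes:

```python
def looks_like_tls(first_bytes: bytes) -> bool:
    """Crude check for a TLS/SSL handshake record at the start of a flow."""
    return (
        len(first_bytes) >= 3
        and first_bytes[0] == 0x16           # handshake record type
        and first_bytes[1] == 0x03           # version major byte (SSL3/TLS)
        and first_bytes[2] in (0x00, 0x01, 0x02, 0x03, 0x04)
    )

def flag_suspicious(port: int, first_bytes: bytes) -> bool:
    """Flag traffic that claims the HTTPS port but doesn't look like TLS."""
    return port == 443 and not looks_like_tls(first_bytes)

print(flag_suspicious(443, bytes([0x16, 0x03, 0x01, 0x02, 0x00])))  # False
print(flag_suspicious(443, b"CUSTOMPROTO\x00\x01"))                 # True
```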


Four common biases in boardroom culture

Boards can be effective only if they can come to a consensus. Let’s say a company is considering the launch of a significant new product, but five of the 12 directors have concerns going into a meeting on the topic. Some have discussed the issue among themselves before the meeting. Many are worried about how the full board discussion will go. In the meeting, one director starts to share his concerns, but the CEO quickly moves on. Over the course of the meeting, more and more heads start to nod along. No parts of the strategy for this new product have changed. But now the entire board appears supportive, including the director whose concerns were dismissed. Though consensus-building is important, boards may be too inclined to seek harmony or conformity. This can lead to groupthink, a much-written-about challenge facing companies in which dissenting views are not welcomed or even entertained. In fact, though most boards work to solicit a range of views and come to a consensus on key issues, the 2020 edition of PwC’s Annual Corporate Directors Survey found that 36% of directors have difficulty voicing a dissenting view on at least one topic in the boardroom. 


Moving Data is Expensive

Data created at the edge must be accessed and processed by the applications in the datacenter. The necessity to move data to the application incurs a productivity penalty. Take media and entertainment: editors, colorists, and special effects artists in multiple locations may sit idle waiting for data to become accessible. A 30-minute delay across 200 animators may result in roughly $400K of unintended cost. Data may have to be moved multiple times, each time incurring the productivity penalty. Every time data is moved or copied, storage resources must be made available to store it. Whether it is persistent storage or a caching device, disk drives are deployed to catch the data being sent. Moving 10TB requires 10TB of storage to be available in every location requiring data access. The cost of storage varies from $120/TB/yr for an archiving tier to $720/TB/yr for a high-performance tier. Every copy created incurs an added storage cost. These estimates are only rough; procuring small amounts of storage may be even more costly, since economies of scale kick in at over 40TB.
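These figures fold into a quick back-of-the-envelope model. The $4,000/hour fully loaded rate below is an assumption backed out of the article's ~$400K example (200 animators × 0.5 hours), not a number stated in the text:

```python
def idle_cost(workers: int, delay_hours: float, hourly_rate: float) -> float:
    """Productivity lost while workers wait for data to arrive."""
    return workers * delay_hours * hourly_rate

def copy_storage_cost(tb_moved: float, n_sites: int, price_per_tb_year: float) -> float:
    """Each site needing access must provision storage for its own copy."""
    return tb_moved * n_sites * price_per_tb_year

# The article's example: 200 animators idled 30 minutes at an implied
# fully loaded $4,000/hour (assumed, see above).
print(idle_cost(200, 0.5, 4000))       # -> 400000.0

# 10 TB replicated to 4 sites on the $720/TB/yr high-performance tier:
print(copy_storage_cost(10, 4, 720))   # -> 28800
```

The same two functions make the trade-off explicit: moving compute to the data eliminates both terms, which is the argument the article is building toward.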


How to Best Assess Your Security Posture

"Risk assessment can help an organization figure out what assets it has, the ownership of those assets and everything down to patch management. It involves figuring out what you want to measure risk around, because there are a bunch of different frameworks out there [such as] NIST and the Cybersecurity Capability Maturity Model (C2M2)," said Bill Lawrence, CISO at risk management platform provider SecurityGate.io. "Then, in an iterative fashion, you want to take that initial baseline or snapshot to figure out how well or how poorly they're measuring up to certain criteria so you can make incremental or sometimes large improvements to systems to reduce risk." ... "Looking at your own scorecard is a good way to get started thinking about assessments, because ultimately you're going to be assigning the same types of weights and risk factors to your vendors," said Mike Wilkes, CISO at cybersecurity ratings company SecurityScorecard. "We need to get beyond thinking that you're going to send out an Excel spreadsheet [questionnaire] once a year to your core vendors."
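The iterative baseline-then-improve loop Lawrence describes can be sketched as a weighted scorecard. The criteria, weights, and scores below are invented for illustration and come from no particular framework:

```python
# Score each criterion 0 (worst) to 5 (best), weight by importance,
# then compare snapshots over time to see whether risk posture improved.
WEIGHTS = {"asset_inventory": 0.3, "patch_management": 0.4, "access_control": 0.3}

def weighted_score(snapshot: dict, weights: dict = WEIGHTS) -> float:
    """Weighted sum of criterion scores for one assessment snapshot."""
    return sum(weights[k] * snapshot[k] for k in weights)

baseline = {"asset_inventory": 2, "patch_management": 1, "access_control": 3}
after_q1 = {"asset_inventory": 3, "patch_management": 3, "access_control": 3}

b, a = weighted_score(baseline), weighted_score(after_q1)
print(round(b, 2), round(a, 2), "improved" if a > b else "regressed")
```

Wilkes's point follows directly: once the weights exist for your own scorecard, applying the same function to vendor snapshots replaces the annual spreadsheet questionnaire with a continuous comparison.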


Leveraging data: what retailers can learn from Netflix

For bricks and mortar retail, collecting data on customers is obviously more difficult – they don’t need to ‘login’ in order to enter a shop. Retailers, however, can track credit cards to group transactions back to a specific customer, and use this data to link both online and offline sales. AI, when used smartly in store, can also help retailers to get to know their customers better. Connected devices and IoT, along with Computer Vision technology (CV), allows businesses to collect data from sensors, cameras and mobile devices on consumer behaviours. This can include the items that are picked up or put back down, the directions visitors move in, whether the shoppers are regulars, or which areas of the store are most visited. Analysing the data gathered by this suite of technologies can, in turn, help drive brand loyalty with a tailored in-store experience. Loyalty programmes continue to have a key role to play in supporting retailers with data capture (such as behavioural and transactional information), and analytics are helping loyalty schemes to become more powerful in driving sales than ever before. 


The real cost of MSSPs not implementing new tech

Taking on a complex cybersecurity landscape without the right tools can result in serious weaknesses that threaten an organization’s networks and data. Among the potential problem areas: The comprehension gap: the lack of a translation layer between tactical and strategic stakeholders (i.e., those making reactive decisions and those who plan for the future) can result in separate tools and systems within an organization. This leads to failures in making crucial, time-sensitive decisions, in fully understanding the threat landscape, and in effectively allocating resources. A regulatory disconnect: organizations need to balance collaborative cybersecurity efforts with compliance. Various regulations, such as the Federal Information Security Management Act (FISMA), the General Data Protection Regulation (GDPR) and the California Privacy Rights Act (CPRA), tend to restrict the ability of security platforms to collect and share threat intelligence. Loss of time and momentum: without the right tools, security teams can find themselves besieged by a steady onslaught of low-impact events and security control system alerts.


Training NLP Engines Without All of the Answers

Natural Language Processing (NLP) or Natural Language Understanding (NLU) is a subset of Artificial Intelligence (AI). There are many benefits when using the technology, and I am surprised at the pushback from technical people when talking about deploying it. I guess there is a difference between learning about technology in academia and the complexity of actually deploying it. ... Another common over-promising statement is that it is easy to build the conversation and responses. In some cases, you build a simple decision matrix via the UI. After a while, you find out all of the variables in the conversation have created a mess. The other option is to create a machine learning (ML) model to look at data and provide observations and predictions. You might as well pull out your calculus textbooks and remember how all of this complex math works to build an ML algorithm. Building ML models is a specialized discipline in applied mathematics. Just because you can take a distance learning course does not mean you have the mathematics to build them. When asking a mathematician how long it will take to observe, hypothesize, and build an algorithm to try, they will tell you it takes time.
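The "simple decision matrix" the author warns about can be sketched as keyword-based intent matching. The intents and keywords below are hypothetical, and even at this size it's easy to see how the approach turns into a mess as the conversation's variables multiply - which is exactly where trained ML models come in:

```python
# Each intent owns a keyword set; score an utterance by keyword overlap.
INTENTS = {
    "billing":  {"invoice", "bill", "charge", "payment"},
    "support":  {"broken", "error", "crash", "help"},
    "greeting": {"hi", "hello", "hey"},
}

def classify(utterance: str) -> str:
    """Pick the intent with the most keyword hits; fall back when none hit."""
    words = set(utterance.lower().split())
    scores = {intent: len(words & kws) for intent, kws in INTENTS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "fallback"

print(classify("My last invoice has a double charge"))  # billing
print(classify("The app keeps showing an error"))       # support
print(classify("What's the weather like?"))             # fallback
```

Every synonym, typo, and overlapping keyword forces another manual edit to the matrix; an ML classifier learns those variations from data instead, at the cost of the mathematical expertise the author describes.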


7 Key Insights of Product Management

There are inputs everywhere: feedback from customers, the team, leadership teams; quant data will tell us something and qual data will give us another insight. But are they all equal? Is the "customer always right?" Noooooooo, not necessarily. Using customers as an example: co-designing solutions can be dangerous, but they are good at helping you discover problems, so get them involved here. Good decisions come from proper weighting and attention to the inputs: the data, customer feedback, the market, your experience built from your track record, the team’s competence and so on. Again, it depends on what company, which product, what market. I’ve been a PM carrying almost everything from articulating and validating the initial idea through to writing FAQs and call scripts for the Customer Service Team. I’ve sometimes looked more like an Executive Producer, focused on the vision and strategy, galvanising multiple teams, suppliers and partners and engaging with a multitude of stakeholders. Perhaps you’re a Product Manager as well as a Product Marketer with your emphasis on positioning, pricing and Go-To-Market. 



Quote for the day:

"Leaders think and talk about the solutions. Followers think and talk about the problems." -- Brian Tracy

Daily Tech Digest - July 14, 2021

Future of testing: Why CART is making penetration testing & attack simulation tools outdated

The inherent challenges with traditional security solutions make a strong case for Continuous Automated Red Teaming (CART) - an emerging new technology which discovers the attack surface and launches safe attacks continuously. It also helps to prioritise the vulnerabilities that are most likely to be attacked, which are typically the path of least resistance. To put it simply, CART automates red teaming and is designed to scale the process and make it more efficient, allowing for continuous discovery of one's attack surface and continuous testing. This makes CART a game-changing strategy in cybersecurity. In addition, CART, unlike penetration testing, finds the attack surface automatically without any inputs. It then launches multiple-stage attacks that range from networks to applications to humans. And, unlike BAS, CART uses an outside-in approach to attack and does not require any hardware or software. Although hackers are sophisticated and have advanced detection and prevention evasion capabilities, CART can help organisations stay ahead of the game by helping them think like a hacker.


Can Government Effectively Help Businesses Fight Cybercrime?

While companies need to better defend themselves, the government can help them by recommending cybersecurity measures and passing along threat information and by taking actions to dissuade attackers, whether it is sanctions against collaborating countries, indictments against individuals, or offensive attacks against the infrastructure used by criminals and their financial windfalls, says Mark Montgomery, senior director of the Center on Cyber and Technology Innovation at the Foundation for Defense of Democracies (FDD) and the executive director of the Cyberspace Solarium Commission. "No one of them can solve it alone—you have to do all three," he says. "We need to be working consistently across all three of those lines of effort." The Ransomware Task Force recommended five policies: Coordinated diplomacy and law enforcement efforts, an aggressive whole-of-government campaign by the United States to dissuade ransomware groups, the establishment of cyber response funds to help business, an international framework for responding to ransomware, and more regulation of cryptocurrency.


Using technology to keep control of your digital footprint in a post-COVID world

The concerns associated with handing over our data to travel companies do not stem from nowhere. Airlines have a notorious reputation for facing data breaches. For instance, British Airways was fined £20 million by the Information Commissioner’s Office (ICO) after the personal information of more than 400,000 staff and customers was leaked, including sensitive data related to banking and payments. More recently, Air India suffered a data breach that leaked the private data of some 4.5 million customers – including their contact information, credit card details, passport and ticket information, and more. And airlines, which at least have been handling high-risk information for years, are not the only companies involved. When restaurants and local bars gain access to healthcare information, they are unlikely to have adequate security measures in place and are therefore at great risk of vulnerabilities that put customers’ privacy in danger. Does that mean we shouldn’t travel due to data privacy concerns? Absolutely not. The solution is to embrace data ownership as part of a value-based internet experience.


Unilever CIO: Digital literacy is the most important new capability to develop

Digital literacy, as Ventura defines it, is the ability of one employee—or an entire culture—to embrace technology-driven innovation in changing the way they work. To develop this capability, Ventura and his team created a Digital Literacy Curriculum, which they are facilitating at the top of the organization with CEO Fabian Garcia, and will expand to include leaders in sales and marketing next. The Curriculum involves Ventura, key members of his IT leadership team, and select business partners who meet three times a month for 45 minutes for a total of 14 sessions. Those sessions focus on cloud, platforms, data, and product, and are organized by experiences: customer, consumer, and employee. “We’ve always talked about ‘applications’ and ‘systems,’ but today we are using so many new terms, that we want to ground these terms in the experiences that matter to our CEO,” says Ventura. “We include our business partners, because Fabian does not want to know about the technical details of a capability; he wants to know how a practitioner will turn insights into action.”


8 Fintech Trends Changing Banking Forever

Not only has the speed of payments been impacted, but the speed of payments innovation is increasing. Solutions like Buy Now, Pay Later (BNPL) have gained popularity almost overnight, creating a unique form of real-time point-of-sale financing. This new innovation has forced traditional banks and even competing payment providers to play catch-up. While most solutions are for short-term deferred payments, new alternatives have been created for larger purchases with longer-term installments. As a result, BNPL has impacted not just debit issuers, but also credit card and personal loan providers. As has been seen in other financial product areas, new payment innovations often have emerged faster than the regulations to protect consumers. As regulators sort through the risks to consumers associated with new solutions, financial institutions have an opportunity to create competing solutions with improved transparency and better risk/reward models. Open Banking has become one of the most important global trends in the banking ecosystem. Originating in the U.K., the concept lowers barriers to entry for alternative financial services providers and enhances the potential for innovation by mandating traditional financial institutions to share financial data through APIs.


Rebuilding your security culture as employees return to the office

People need to move data to get their work done, and it can be a natural instinct for security teams to respond negatively to data exfiltration alerts. However, Code42 research shows that most data leaks happen unintentionally. One example of this could be when someone accidentally exfiltrates data when they connect a personal drive to their work device, unintentionally syncing work files onto their personal cloud. Instead of leaping to the conclusion that employees are stealing data, investigate to find out more. Often, they are simply trying to get work done or collaborate with a colleague or partner. Use these moments as an opportunity to educate them on more secure ways to share data, always beginning the conversation with positive intent. For example, start with “We noticed this… did you see it, too?” rather than starting the conversation with an accusatory tone. Doing so will position them as security allies instead of security enemies, and that’s a better way to encourage them to work together with your security team. Emphasize the importance of security and why it matters to all employees as they return to the office.


Three things essential to the future of edge computing

At its core, edge computing relies on geographically disparate pieces of equipment being able to seamlessly talk with one another. This could be compute or storage nodes talking with one another, or those nodes talking with sensors or machinery that collect or action an edge network’s data. Edge infrastructure depends on those technologies being able to reliably interact. Geographic separation has also led to a tendency towards diversity in equipment. Whether due to supplier availability or adaptations to the local area, the most efficient edge infrastructure is one that can accommodate a variety of technologies. In practice, the marketplace pressure to accommodate this is often unavoidable for many larger operators of edge networks, especially for those that wish to avoid lock-in with a particular vendor. To make a diverse and disparate edge network viable, organisations need to adopt open technologies. Creating standards around open source software and hardware to ensure that they can interact via open source solutions is ultimately the only way to guarantee that every component in a diverse and distributed edge network can interact with its counterparts.


Computer vision adoption expected to grow significantly in the near future

Manufacturers typically implement CV for quality control and process optimization, using systems to perform inspections with greater accuracy and at higher speeds than human workers, he said. "Beyond the production line, these systems have significant potential to augment or automate tedious, dangerous or expensive work, such as routine cycle counts and equipment inspections," Ajgaonkar said. CV is also useful for security in warehouse environments. In retail, CV is often applied for inventory optimization and to improve customer experience, ensuring that products are properly stocked, monitoring checkout lines and curbside pickups, and keeping an eye out for product spills, he said. "We see major successes with CV implementation across sectors: in utilities, transportation, manufacturing and production, retail and healthcare," Ajgaonkar said. In all of these verticals, the use of CV improves efficiency, freeing up employees to focus on more mission-critical tasks, he said. The increasing adoption of AI and the Internet of Things across industries is making CV something organizations should pay attention to now, he said.
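Stripped to its essence, a production-line quality check is a flag-the-outliers pass over pixel intensities. The sketch below uses a made-up grayscale grid in place of a camera feed; real systems rely on trained models rather than a fixed threshold, so treat this as an illustration of the idea, not a recipe:

```python
def find_defects(image, expected=200, tolerance=40):
    """Return (row, col) coordinates whose intensity (0-255) is out of tolerance."""
    return [
        (r, c)
        for r, row in enumerate(image)
        for c, px in enumerate(row)
        if abs(px - expected) > tolerance
    ]

# Toy 3x4 intensity grid for a uniform panel (~200 everywhere when good):
panel = [
    [205, 198, 201, 203],
    [199,  60, 202, 200],   # dark scratch at (1, 1)
    [204, 201, 197, 255],   # bright spot at (2, 3)
]
print(find_defects(panel))  # [(1, 1), (2, 3)]
```

The speed advantage the article cites comes from exactly this shape of computation: a per-pixel check runs in a fixed, tiny time per frame, so it never tires or slows the way a human inspector does.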


6 IT talent retention strategies: Chicago CIO of the Year winners share

Without a doubt, the post-pandemic world has accelerated digitalization and the emerging hybrid work environment. To effectively compete and win in the marketplace, companies across all industries must execute strategies faster and pivot rapidly to seize new opportunities. Talent is central to success. My organization has implemented workforce initiatives to motivate and increase engagement among employees. We are ensuring that employee contributions are aligned to business priorities to provide a sense of purpose and meaning. We are providing schedule and location flexibility and investing in career growth. We have stepped up communication and involve a broad cross-section of employees in discussions to shape the future of the workplace. ... The new remote reality has indeed placed a brighter-than-ever spotlight on the importance of retaining top talent. I believe in investing in our people so they can learn new skills, which not only enrich and challenge them personally, but also enable them to contribute to our business success. Our mission is to deliver secure, reliable payments to the whole country. Knowing that the day-to-day work our team members do impacts every household and every business in the country is incredibly motivating and rewarding.


Global frameworks the way forward for AI and data privacy — Google CEO

Artificial intelligence (AI) has been quickly evolving, playing a gradually larger role in people’s lives. Looking forward to the next quarter of a century, given the evolution that the Internet has made in this time frame, Pichai expressed belief that while AI is still in its early stages, people will need to ensure that the technology develops in a way that benefits society. “I expect [AI] to play a foundational role across every aspect of our lives, be it healthcare, education, how we manufacture things, and how we consume information,” he said. “Today, it’s already changing our lives in simpler ways. In healthcare, when a radiologist is doing scans, [AI] may be acting as an assistant, flagging where [the radiologist] may want to give an extra look, or prioritise, because it looks worrisome. “Over time, we’ll be with more intelligent systems, and it can make humans more productive than we’ve ever imagined.” When asked whether society is unprepared for the rise in AI, Pichai said that while this may partly be true, human potential is always



Quote for the day:

"Without growth, organizations struggle to add talented people. Without talented people, organizations struggle to grow." -- Ray Attiyah

Daily Tech Digest - July 13, 2021

The biggest data science trends in banking

Another rising data science trend within banking is the use of traceable timing solutions. With timestamping regulations in financial services getting stricter, and data scientists looking to maintain the integrity of assets, these solutions look to improve the accuracy of time sources. Richard Hoptroff, CTO and founder of Hoptroff, said: “Network-derived and precise traceable timing solutions are an innovative aid which can be used to bolster the data infrastructure of banks. They are a growing alternative to traditional, satellite-dependent means of achieving time. Network-derived time can be used to optimise trade lifecycle management, improve transaction reporting and inform strategic decisions. This enables the verification of transactions to become more efficient and reliable and also opens up the possibility to identify significant cost savings. “The implications of the Fourth Industrial Revolution mean that traceable timing solutions are becoming increasingly relevant outside of financial services – as posited by Brad Casemore, the vice-president of Datacenter Networks at IDC, ‘Time and time services are more ubiquitous and more valuable today than many business leaders realise’.”
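Network-derived time typically rests on the NTP-style offset calculation: four timestamps around a request/response exchange yield the client's clock offset and the round-trip delay. This is the standard textbook arithmetic (RFC 5905), shown for illustration; commercial traceable-time services build calibration and audit trails on top of ideas like this, and the sample timestamps are invented:

```python
def clock_offset(t0, t1, t2, t3):
    """NTP-style estimate from one request/response exchange.

    t0: client send, t1: server receive, t2: server send, t3: client receive
    (t0, t3 on the client's clock; t1, t2 on the server's).
    Returns (offset, round_trip_delay) in the same units as the inputs.
    """
    offset = ((t1 - t0) + (t2 - t3)) / 2
    delay = (t3 - t0) - (t2 - t1)
    return offset, delay

# Example: client clock 0.5 s behind the server, 40 ms each way on the wire.
offset, delay = clock_offset(10.000, 10.540, 10.541, 10.081)
print(round(offset, 3), round(delay, 3))  # 0.5 0.08
```

With the offset known, every transaction timestamp can be corrected to the reference clock, which is what makes the verification of transactions "more efficient and reliable" in the sense quoted above.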


5 things cybersecurity leaders need to know to make hybrid work safe

Companies need to buy into zero trust, or the philosophy that organizations should not trust anything inside or outside their network. Even the savviest internet users are targets for phishing scams and require constant education to sidestep hackers targeting vulnerabilities. Zero trust can manifest as a suite of programs to prevent phishing. Web users must keep up-to-date on common scams, such as suspicious links and misspelled email addresses, to avoid putting the workplace at risk. Companies can install software to block downloading external software and monitor how devices can be used. Cryptographic systems such as two-factor authentication (2FA) could be key to staying secure, Cerf and Rashid agreed. That could take the form of an app on a smartphone or a physical cryptographic device. Employers can also apply these principles to working in the office. There may come a time where the corporate network is compromised, and it is crucial that security teams assume their networks could be exposed. Cerf predicts 2021 will bring expanded internet coverage in rural areas and increased 5G speeds and capabilities. 
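The "app on a smartphone" form of 2FA the article mentions is usually TOTP (RFC 6238): an HMAC over a shared secret and a 30-second time counter, truncated to six digits. Client and server compute the code independently, so a stolen password alone is not enough. A minimal sketch using only the standard library:

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HOTP (RFC 4226): HMAC-SHA1 over a counter, dynamically truncated."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                                   # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, period: int = 30, at=None) -> str:
    """TOTP (RFC 6238): HOTP with the counter derived from wall-clock time."""
    t = int((time.time() if at is None else at) // period)
    return hotp(secret, t)

# RFC 4226's published test secret and first counter values:
print(hotp(b"12345678901234567890", 0))  # 755224
print(hotp(b"12345678901234567890", 1))  # 287082
```

The same construction underlies hardware tokens; only the secret distribution and clock source differ, which is why Cerf's physical cryptographic device and the smartphone app are interchangeable in this scheme.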


Vulnerability in Schneider Electric PLCs allows for undetectable remote takeover

Once leaked, attackers can use the stolen hash to take over the secure connection that UMAS establishes between the PLC and its managing workstation, allowing the attacker to reconfigure the PLC without needing to know a password. Reconfiguration, in turn, allows the attacker to perform remote code execution attacks, including installation of malware and steps to obfuscate their presence. Schneider Electric said it applauds security researchers like Armis and has been working with the company to validate its claims and determine remediation steps. "Our mutual findings demonstrate that while the discovered vulnerabilities affect Schneider Electric offers, it is possible to mitigate the potential impacts by following standard guidance, specific instructions; and in some cases, the fixes provided by Schneider Electric to remove the vulnerability," Schneider said in a statement. Industrial control systems vulnerabilities have been a rising problem in recent years, but it's important to note that just because PLCs like Schneider's Modicon line are vulnerable doesn't mean an attacker will have an easy time taking control of them.


For The First Time, Scientists Have Connected a Superconductor to a Semiconductor

Ultrathin semiconductors like the one used here are currently a hot investigation topic for researchers: they can be stacked together to form entirely new synthetic materials known as van der Waals heterostructures. These structures have a lot of potentially innovative uses, such as being able to control electron magnetism with electric fields. However, a lot of this potential is still theoretical, because scientists just don't know what effects they're going to get yet and what devices they might be able to make. Which is why succeeding in creating this latest combination is so important. ... Getting this semiconductor-superconductor link together isn't easy – as you would expect, considering no one has done it before. The semiconductor is placed in a sandwich, with insulating layers above and below, while holes etched in the top of the insulating layer provide the electrical contact access. The superconducting material fills the gaps left by the holes, and the process is finished inside a nitrogen-filled glove box to protect the finished system from damage. Remote-controlled micromanipulators are used to complete the fabrication, under an optical microscope.


Financial services unchained: The ongoing rise of open financial data

Open financial data could put powerful non-bank companies in a stronger position to become financial-services players. With digital adoption leaping ahead by years in just several months, many ecommerce, tech, and social-media companies have accumulated a massive lead in customer attention. This opens the possibility for them to be the first port of call for new financial products and services to their user bases, similar to what Google now enables customers to do with its “Plex” product, connected to the Google Pay app. According to the Google website, Plex is offered in partnership with 11 banks and credit unions and includes physical and virtual debit cards, peer-to-peer payments, and an associated checking account. In Singapore, the government recently issued banking licenses to five nonbanking players, including the consumer ecosystem Grab (200 million users in eight countries) and the consumer internet company Sea. The surge in online activity and digital behaviors has also opened up new avenues for companies to integrate financial services directly into customers’ daily activities, such as online shopping and the management of payments related to cars.


Dutch prosecutor ordered to give evidence on EncroChat hack

The decision is the first time an official has been required to explain the role of the Netherlands in the operation to hack EncroChat, which has led to arrests worldwide of hundreds of members of organised crime groups. The Dutch Public Prosecution Service’s public position is that it was not involved in the development or deployment of a “software implant” used by the French Gendarmerie to harvest 120 million messages from the phones, which were largely used by organised criminal groups. Dutch prosecutors argue that it is not up to the Dutch courts to assess the legality of the French police operation to intercept messages from EncroChat, which were subsequently shared with the Netherlands, the UK, Sweden and other countries. But the claim has been questioned by defence lawyers in the Netherlands, who point to evidence from the UK and elsewhere that suggests the Dutch and French Gendarmerie worked closely together on the operation. A court in Den Bosch ruled last week that a public prosecutor involved in the Dutch investigation into EncroChat, codenamed 26Lemont, should give evidence on the Dutch judiciary’s role in the operation with the French.


Moving away from ReactJs and VueJs on front-end using Clean Architecture

One of the biggest problems when it comes to evolving an application is coupling to the UI framework. As front-end applications have gradually gained more responsibilities over time, it makes more and more sense to develop them in a structured way; the problems to be solved are very similar to those on other fronts, such as backend or mobile development. Frameworks like ReactJs and VueJs make it easier to take on these challenges on the front-end. A front-end application today is, in many cases, independent of the backend and therefore needs its own architecture. ... Using the UI framework only for what is strictly necessary allows us to better adapt to changes in future versions of the framework. This is because the code that contains the application logic, the most important part and the part that changes least over time, can potentially be shared between two versions of the same app, as in this example, since it is decoupled from the UI framework.
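The idea of keeping application logic out of the UI framework can be sketched roughly as follows. This is a minimal, hypothetical example (the names and the to-do domain are illustrative, not taken from the article's own code): the use case and repository interface have no React or Vue imports, so either framework, or a plain test, can drive them.

```typescript
// Domain model: plain data, no framework types.
interface Todo {
  id: number;
  title: string;
  done: boolean;
}

// Port: abstracts persistence so the use case stays framework-agnostic.
interface TodoRepository {
  getAll(): Todo[];
  save(todo: Todo): void;
}

// Use case: pure application logic, reusable from React, Vue, or tests.
class ToggleTodoUseCase {
  constructor(private repo: TodoRepository) {}

  execute(id: number): Todo[] {
    const todos = this.repo.getAll().map((t) =>
      t.id === id ? { ...t, done: !t.done } : t
    );
    todos.forEach((t) => this.repo.save(t));
    return todos;
  }
}

// In-memory adapter, useful in tests; a real app would wire an HTTP adapter.
class InMemoryTodoRepository implements TodoRepository {
  private todos = new Map<number, Todo>();
  getAll(): Todo[] {
    return [...this.todos.values()];
  }
  save(todo: Todo): void {
    this.todos.set(todo.id, todo);
  }
}
```

A React component or a Vue view would then only render state and call `execute()`, so migrating between framework versions touches the thin UI layer rather than the logic.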


Data Scientists and ML Engineers Are Luxury Employees

First, machine learning and data science are fascinating fields, mostly because they sit at the crossroads of computer science, mathematics, and business understanding. This means there is far more room for personal growth. When switching from software engineering to machine learning engineering, my backyard suddenly tripled in size. I could continue to craft beautiful and complex systems with my hands (and become the great engineer I wanted to be) while satisfying far more of my intellectual curiosity. Now every day, I get to learn and stand on the shoulders of giants! Apart from the interest in the field, another main reason is a bit more practical. I have spent so much time and energy learning the necessary topics (think probability, statistics, calculus, linear algebra, distributed computing, machine learning, deep learning…) that I want this knowledge to stick. And we are all human. Even if you are a genius, if you don’t practice what you learn, the knowledge goes away. So when your boss asks you (for the tenth time in a row) to create a piece of software or an analysis that has nothing to do with machine learning, what is it that you think? Are you happy?


Blockchain 'Immutability' Dispute Sparked by Ethereum Request for Reorg Contract

Conversations and debates concerning blockchain immutability have been taking place for years, and a fresh new discussion has ignited over a tweet and smart contract published on GitHub by the developer Bunny Girl. On July 10, Bunny Girl explained that the smart contract aims to enhance systematic chain reorganizations. A blockchain reorganization is a contentious subject and basically occurs when a chain of recorded blocks is invalidated. Reorganizations have taken place on various blockchains when a mining entity or group of miners controls more than 51% of the hashrate. Blockchain reorganizations force miners (not participating in the reorg) back to a point where they have to start again from a specific block height. It’s akin to rolling back a recorded history of transactions and then re-recording them, but of course, the new transactions would never be the same as the ones that were erased. ... The RFR thread was followed by an extremely mixed reception. “So we just ignoring immutability now?” asked one person in response to Bunny Girl’s tweetstorm.
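The rollback mechanic described above can be illustrated with a toy sketch. This is not any client's actual fork-choice code (real implementations weigh accumulated proof-of-work, not block count, and the block identifiers here are made up); it only shows how a longer competing branch displaces blocks above the fork point.

```typescript
// Height of the first block at which the two chains disagree; in a
// reorg, blocks at or above this height are rolled back.
function forkPoint(chainA: string[], chainB: string[]): number {
  let height = 0;
  while (
    height < chainA.length &&
    height < chainB.length &&
    chainA[height] === chainB[height]
  ) {
    height++;
  }
  return height;
}

// Simplified longest-chain rule: adopt the competing branch only if
// it is longer than the currently active chain.
function chooseChain(active: string[], competing: string[]): string[] {
  return competing.length > active.length ? competing : active;
}

const active = ["genesis", "A1", "A2", "A3"];
const competing = ["genesis", "A1", "B2", "B3", "B4"];

console.log(forkPoint(active, competing));   // blocks above this height are rolled back
console.log(chooseChain(active, competing)); // the longer branch replaces A2 and A3
```

Miners who had built on `A2` and `A3` are forced back to the fork point and must continue from the adopted branch, which is exactly the rollback-and-re-record effect the article describes.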


IOTA - The Most Accessible DLT Network for NFTs

In recent months, NFTs have been a major talking point among tech heads, industry experts and the art world. Even traditional, non-industry media has covered the feverish debate surrounding the possibility of digitally representing assets on a distributed ledger and trading them as so-called NFTs on dedicated platforms. NFTs are cryptographic tokens that represent something unique (or ‘non-fungible’) in digitized form. Virtually anything can be digitized and represented as an NFT. You can represent and trade digital assets such as digital artworks, video clips, music or gaming items as NFTs, as well as physical assets such as real estate, paintings, or vintage cars. Thanks to the underlying DLT, ownership of an NFT is digitally certified and can only be changed (either by transferring it or destroying it) by the owner. Many new use cases are only possible by means of the digitized representation of digital asset ownership. ... NFTs maintain their uniqueness, which makes them an excellent solution for digitizing the collectibles market, where there is proof of ownership and protection from copyright infringement.



Quote for the day:

"The meeting of two personalities is like the contact of two chemical substances: if there is any reaction, both are transformed." -- Carl Jung