
Daily Tech Digest - September 24, 2025


Quote for the day:

"Great leaders do not desire to lead but to serve." -- Myles Munroe


Managing Technical Debt the Right Way

Here’s the uncomfortable truth: most executives don’t care about technical purity, but they do care about value leakage. If your team can’t deliver new features fast enough, if outages are too frequent, if security holes are piling up, that is financial debt—just wearing a hoodie instead of a suit. The BTABoK approach is to make debt visible in the same way accountants handle real liabilities. Use canvases, views, and roadmaps to connect the hidden cost of debt to business outcomes. Translate debt into velocity lost, time to market, and risk exposure. Then prioritize it just like any other investment. ... If your architects can’t tie debt decisions to value, risk, and strategy, then they’re not yet professionals. Training and certification are not about passing an exam. They are about proving you can handle debt like a surgeon handles risk—deliberately, transparently, and with the trust of society. ... Let’s not sugarcoat it: some executives will always see debt as “nerd whining.” But when you put it into the lifecycle, into the transformation plan, and onto the balance sheet, it becomes a business issue. This is the same lesson learned in finance: debt can be a powerful tool if managed, or a silent killer if ignored. BTABoK doesn’t give you magic bullets. It gives you a discipline and a language to make debt a first-class concern in architectural practice. The rest is courage—the courage to say no to shortcuts that aren’t really shortcuts, to show leadership the cost of delay, and to treat architectural decisions with the seriousness they deserve.
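One way to make that translation concrete is a cost-of-delay comparison. The sketch below is purely illustrative (the figures and the payback heuristic are invented for the example, not a BTABoK formula): it ranks debt items by how quickly fixing them pays for itself, so debt can be prioritized like any other investment.

```python
def months_to_payback(remediation_cost: float, monthly_leakage: float) -> float:
    """Months until fixing a debt item pays for itself.

    monthly_leakage = estimated value lost each month the debt remains
    (slower delivery, outage hours, risk exposure). All numbers here
    are hypothetical.
    """
    return remediation_cost / monthly_leakage

backlog = {
    "legacy auth module": months_to_payback(80_000, 20_000),  # 4.0 months
    "flaky CI pipeline":  months_to_payback(15_000, 10_000),  # 1.5 months
    "unpatched gateway":  months_to_payback(30_000, 2_000),   # 15.0 months
}

# Shortest payback first: prioritize debt like any other investment.
for item, months in sorted(backlog.items(), key=lambda kv: kv[1]):
    print(f"{item}: pays back in {months:.1f} months")
```

The point is not the arithmetic but the framing: once each debt item carries a cost figure, the conversation with executives stops being about "technical purity."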


How National AI Clouds Undermine Democracy

The rapid spread of sovereign AI clouds unintentionally creates a new form of unchecked power. It combines state authority with corporate technology in unclear public-private partnerships. This combination centralizes surveillance and decision-making power, extending far beyond effective democratic oversight. The pursuit of national sovereignty undermines the civic sovereignty of individuals. ... The unique and overlooked danger is the rise of a permanent, unelected techno-bureaucracy. Unlike traditional government agencies, these hybrid entities are shielded from democratic pressures. Their technical complexity acts as a barrier against public understanding and journalistic inquiry. ... no sovereign cloud should operate without a corresponding legislative data charter. This charter, passed by the national legislature, must clearly define citizens' rights against algorithmic discrimination, set explicit limits on data use, and create transparent processes for individuals harmed by the system. It should recognize data portability as an essential right, not just a technical feature. ... every sovereign AI initiative should be mandated to serve the public good. These systems must legally demonstrate that they fulfill publicly defined goals, with their performance measured and reported openly. This directs the significant power of AI toward applications that benefit the public, such as enhancing healthcare outcomes or building climate resilience.


IT’s renaissance risks losing steam

IT-enabled value creation will etiolate without the sustained light of stakeholder attention. CIOs need to manage IT signals, symbols, and suppositions with an eye toward recapturing stakeholder headspace. Every IT employee needs to get busy defanging the devouring demons of apathy and ignorance surrounding IT operations today. ... We need to move beyond our “hero on horseback” obsession with single actors. Instead we need to return our efforts forcefully to l’histoire des mentalités — the study of the mental universe of ordinary people. How is l’homme moyen sensuel (the man on the street) dealing with the technological choices arrayed before him? ... The IT pundits’ much-discussed promise of “technology transformation” will never materialize if appropriate exothermic — i.e., behavior-inducing and energy-creating — IT ideas have no mass following among those working at the screens around the world. ... As CIO, have you articulated a clear vision of what you want IT to achieve during your tenure? Have you calmed the anger of unmet expectations, repaired the wounds of system outages, alleviated the doubts about career paths, charted a filled-with-benefits road forward and embodied the hopes of all stakeholders? ... The cognitive elephant in the room that no one appears willing to talk about is the widespread technological illiteracy of the world’s population.


How One Bad Password Ended a 158-Year-Old Business

KNP's story illustrates a weakness that continues to plague organizations across the globe. Research from Kaspersky analyzing 193 million compromised passwords found that 45% could be cracked by hackers within a minute. And when attackers can simply guess or quickly crack credentials, even the most established businesses become vulnerable. Individual security lapses can have organization-wide consequences that extend far beyond the person who chose "Password123" or left their birthday as their login credential. ... KNP's collapse demonstrates that ransomware attacks create consequences far beyond an immediate financial loss. Seven hundred families lost their primary income source. A company with more than a century and a half of history disappeared overnight. And Northamptonshire's economy lost a significant employer and service provider. For companies that survive ransomware attacks, reputational damage often compounds the initial blow. Organizations face ongoing scrutiny from customers, partners, and regulators who question their security practices. Stakeholders seek accountability for data breaches and operational failures, leading to legal liabilities. ... KNP joins an estimated 19,000 UK businesses that suffered ransomware attacks last year, according to government surveys. High-profile victims have included major retailers like M&S, Co-op, and Harrods, demonstrating that no organization is too large or established to be targeted.
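To see why fast cracking matters, here is a rough back-of-envelope sketch in Python. The guess rate is an assumed figure for an offline attack (real rates vary enormously with the hashing algorithm used), and weak choices like "Password123" fall to dictionary attacks long before brute force even starts:

```python
def crack_time_seconds(length: int, charset_size: int,
                       guesses_per_second: float = 1e10) -> float:
    """Worst-case brute-force time for a truly random password.

    guesses_per_second is an assumed rate for an offline GPU attack;
    real figures depend heavily on how the password was hashed.
    """
    return charset_size ** length / guesses_per_second

lower_8 = crack_time_seconds(8, 26)    # lowercase only: ~21 seconds
mixed_12 = crack_time_seconds(12, 62)  # letters + digits: millennia

print(f"8-char lowercase:       {lower_8:,.0f} seconds")
print(f"12-char letters+digits: {mixed_12 / 3.156e7:,.0f} years")
```

The gap between seconds and millennia is why length and randomness, not clever substitutions, are what keep a credential out of the 45%.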


Has the UK’s Cyber Essentials scheme failed?

There are several reasons why larger organisations may steer clear of CE in its current form, explains Kearns. “They typically operate complex, often geographically dispersed networks, where basic technical controls driven by CE do not satisfy organisational appetite to drive down risk and improve resilience,” she says. “The CE control set is also ‘absolute’ and does not allow for the use of compensating controls. Large complex environments, on the other hand, often operate legacy systems that require compensating controls to reduce risk, which prevents compliance with CE.” The point-in-time nature of assessment is also a poor fit for today’s dynamic IT infrastructure and threat environments, argues Pierre Noel, field CISO EMEA at security vendor Expel. ... “For large enterprises with complex IT environments, CE may not be comprehensive enough to address their specific security needs,” says Andy Kays, CEO of MSSP Socura. “Despite these limitations, it still serves a valuable purpose as a baseline, especially for supply chain assurance where larger companies want to ensure their smaller partners have a minimum level of security.” Richard Starnes is an experienced CISO and chair of the WCIT security panel. He agrees that large enterprises should require CE+ certification in their supplier contracts, where it makes sense. “This requirement should also include a contract flow-down to ensure that their suppliers’ downstream partners are also certified,” says Starnes.


Is Your Data Generating Value or Collecting Digital Dust?

Economic uncertainty is prompting many companies to think about how to do more with less. But what if they’re actually positioned to do more with more and just don’t realize it? Many organizations already have the resources they need to improve efficiency and resilience in challenging times. Close to two-thirds of organizations manage 1 petabyte or more of data, which represents enough data to cover 500 billion standard pages of text. More than 40% of companies store even more data. Much of that data sits unanalyzed while it incurs costs related to collection, compliance, and storage. It also poses data breach risks that require expensive security measures to prevent. ... Engaging with too many apps often makes employees less efficient than they could be. In 2024, companies used an average of 21 apps just for HR tasks. Multiply that across different functions, and it’s easy to see how finding ways to reduce the total could bring down costs. Trimming the number of apps can also increase productivity by reducing employee overwhelm. Constantly switching between different apps and systems has been shown to distract employees while increasing their levels of stress and frustration. Across the organization, switching among tasks and apps consumes 9% of the average employee’s time at work by chipping away at their attention and ability to focus a few seconds at a time with each of the hundreds of task switches they perform every day.


The history and future of software development

For any significant piece of software back then, you needed stacks of punch cards. Yes, 1000 lines of code needed 1000 cards. And you needed to have them in order. Now, imagine dropping that stack of 1000 cards! It would take ages to get them back in order. Devs back then experienced this a lot—so some of them came up with creative ways of indicating the order of these cards. ... By the mid-1970s, affordable home computers were starting to become a reality. Instead of a computer just being a work thing, hobbyists started using computers for personal things—maybe we can call these, I don't know...personal computers. ... Assembler and assembly tend to be used interchangeably, but they are in reality two different things. Assembly is the actual language: the syntax and instructions, tightly coupled to the architecture. The assembler, meanwhile, is the piece of software that assembles your assembly code into machine code—the thing your computer knows how to execute. ... What about writing the software? Did they use Git back then? No, Git only came out in 2005, so back then software version control was quite the manual effort, from developers having their own ways of managing source code locally to wall charts where developers could "claim" ownership of certain source code files. For those able to work on a shared (multi-user) system, or with an early version of some networked storage, source code sharing was as easy as handing out floppy disks.
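One of those "creative ways" was to reserve the last columns of each 80-column card for a sequence number, so a dropped deck could be re-sorted mechanically. The Python sketch below illustrates the idea; the columns 73-80 convention comes from fixed-form FORTRAN decks (the compiler ignored those columns), and the card contents are invented:

```python
def make_card(source: str, seq: int) -> str:
    """Lay out one 80-column card: code in columns 1-72,
    an 8-digit sequence number in columns 73-80."""
    return source.ljust(72)[:72] + f"{seq:08d}"

def restore_card_order(cards: list[str]) -> list[str]:
    """Sort a dropped deck back into order by its sequence field."""
    return sorted(cards, key=lambda card: card[72:80])

deck = [  # a "dropped" deck, out of order
    make_card("      PRINT *, 'STEP 3'", 30),
    make_card("      PRINT *, 'STEP 1'", 10),
    make_card("      PRINT *, 'STEP 2'", 20),
]
for card in restore_card_order(deck):
    print(card[:24].rstrip(), "<- seq", card[72:80])
```

Numbering cards by tens (10, 20, 30) left room to insert a new card between two existing ones without repunching the whole deck.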


Why the operating system is no longer just plumbing

Many enterprises still think of the operating system as a “static” or background layer that doesn’t need active evolution. The reality is that modern operating systems like Red Hat Enterprise Linux (RHEL) are dynamic, intelligent platforms that actively enable and optimize everything running on top of them. Whether you're training AI models, deploying cloud-native applications, or managing edge devices, the OS is making thousands of critical decisions every second about resource allocation, security enforcement, and performance optimization. ... With image mode deployments, zero-downtime updates, and optimized container support, RHEL ensures that even resource-constrained environments can maintain enterprise-grade reliability. We’ve also focused heavily on security—confidential computing, quantum-resistant cryptography, and compliance automation—because edge environments are often exposed to greater risk. These choices allow RHEL to deliver resilience in conditions where compute power, space, and connectivity are limited. ... We don't just take community code and ship it — we validate, harden, and test everything extensively. Red Hat bridges this gap by being an active contributor upstream while serving as an enterprise-grade curator downstream. Our ecosystem partnerships ensure that when new technologies emerge, they work reliably with RHEL from day one.


Ransomware now targeting backups, warns Google’s APAC security chief

Backups often contain sensitive data such as personal information, intellectual property, and financial records. Pereira warned that attackers can use this data as extra leverage or sell it on the dark web. The shift in focus to backup systems underscores how ransomware has become less about disruption and more about business pressure. If an organisation cannot restore its systems independently, it has little choice but to consider paying a ransom. ... Another troubling trend is “cloud-native extortion,” where attackers abuse built-in cloud features, such as encryption or storage snapshots, to hold systems hostage. Pereira explained that many organisations in the region are adapting by shifting to identity-focused security models. “Cloud environments have become the new perimeter, and attackers have been weaponising cloud-native tools,” he said. “We now need to enforce strict cloud security hygiene, such as robust MFA, least-privilege access, proactive monitoring of role access changes or credential leaks, using automation to detect and remediate misconfigurations, and anomaly detection tools for cloud activities.” He pointed to rising investments in identity and access management tools, with organisations recognising their role in cutting down the risk of identity-based attacks. For APAC businesses, this means moving away from legacy perimeter defences and embracing cloud-native safeguards that assume breaches are inevitable but limit the damage.


AI Won't Replace Developers, It Will Make the Best Ones Indispensable

The replacement theory assumes AI can work independently, but it can't. Today's AI coding tools don't run themselves, they need active steering. Most AI tools today operate on a "prompt and pray" model: give the AI instructions, get code back, hope it works. That's fine for demos or side projects, but production environments are far less forgiving. ... AI doesn't level the playing field between developers, it widens it. Using AI effectively requires the same skills that make great developers great: understanding system architecture, recognizing security implications, writing maintainable code. ... Tomorrow's junior developers will need to get productive in a different way. Instead of spending months learning basic syntax and patterns, they'll start by learning to collaborate with AI agents effectively. Those who can adapt will find opportunities, and those who can't might struggle to break in. This shift actually creates more demand for senior engineers, because someone needs to train these AI-assisted junior developers, architect systems that can handle AI-generated code at scale, and establish the processes and standards that keep AI tools from creating chaos. ... The teams succeeding with AI coding treat agents like exceptionally capable junior teammates who need oversight. They provide detailed context, review generated code, and test thoroughly before deployment rather than optimizing purely for speed.

Daily Tech Digest - July 04, 2021

The importance of Robotic Process Automation

Michael believes that RPA will grow in importance in the future for a number of reasons. Firstly, understanding. It’s no longer an unknown technology. So many large organizations have Digital Workforces, and so the worry and uncertainty around them have gone. Secondly, there is a real drive to add ‘Intelligent’ ahead of ‘Automation’. Whilst we aren’t quite at the widespread adoption of ‘Intelligent Automation’ just yet, these cognitive elements are getting better and more available each week. Once we have more use cases, we will see the early adopters of RPA start to take the next step and begin to ‘add the human back into the robot’. Thirdly, the net cost of RPA is decreasing. There are now community versions available free of charge, additional software given as part of the platforms, and training available for free. The barriers to entry are disappearing. Furthermore, Mahesh highlights that the global pandemic and the economic crisis have put a lot of organizations in a state of flux, made them change business processes, and highlighted the need for more automation through RPA.


How AI Is Changing The Real Estate Landscape

AI has applications in estimating the market value of properties and predicting their future price trajectory. For example, ML algorithms combine current market data and public information such as mobility metrics, crime rates, schools, and buying trends to arrive at the best pricing strategy. The AI uses a regression algorithm, accounting for property features such as size, number of rooms, property age, home quality characteristics, and macroeconomic demographics, to calculate the best price range. The algorithms can also predict prices based on geographic location or future development. Online real estate marketplace Zillow puts out home valuations for 104 million homes across the US. The company, founded by former Microsoft executives, uses cutting-edge statistical and machine learning models to vet hundreds of data points for individual homes. Zillow employs a neural network-based model to extract insights from huge swathes of data, tax assessor records, and direct feeds from hundreds of multiple listing services and brokerages.
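As a toy illustration of the regression idea, here is a single-feature ordinary-least-squares fit in Python. The sizes and prices are invented, and real valuation models like Zillow's are of course far more sophisticated than a one-variable line:

```python
def fit_line(xs: list[float], ys: list[float]) -> tuple[float, float]:
    """Closed-form ordinary least squares for a single feature."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Hypothetical comparable sales: floor area (sq ft) vs. sale price
sizes = [900, 1400, 1700, 2000]
prices = [150_000, 250_000, 310_000, 420_000]

slope, intercept = fit_line(sizes, prices)
estimate = slope * 1600 + intercept
print(f"Estimated price for 1,600 sq ft: ${estimate:,.0f}")
```

A production model would add the other features the article lists (rooms, age, quality, demographics) as extra regression terms, but the fitting principle is the same.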


Quantum Computing just got desktop sized

Quantum computing is coming on leaps and bounds. Now there’s an operating system available on a chip, thanks to a Cambridge University-led consortium whose vision is to make quantum computers as transparent and well known as the Raspberry Pi. This “sensational breakthrough” is likened by the Cambridge Independent to the moment during the 1960s when computers shrank from being room-sized to sitting on top of a desk. Around 50 quantum computers have been built to date, and they all use different software – there is no quantum equivalent of Windows, iOS or Linux. The new project will deliver an OS that allows the same quantum software to run on different types of quantum computing hardware. The system, Deltaflow.OS (full name Deltaflow-on-ARTIQ), has been designed by Cambridge University startup Riverlane. It runs on a chip developed by consortium member SEEQC using a fraction of the space necessary in previous hardware. SEEQC is headquartered in the US with a major R&D site in the UK. “In its most simple terms, we have put something that once filled a room onto a chip the size of a coin, and it works,” said Dr. Matthew Hutchings.


This Week in Programming: GitHub Copilot, Copyright Infringement and Open Source Licensing

On the idea of copyright infringement, Guadamuz first points to a research paper by Albert Ziegler published by GitHub, which looks at situations where Copilot reproduces exact texts, and finds those instances to be exceedingly rare. In the original paper, Ziegler notes that “when a suggestion contains snippets copied from the training set, the UI should simply tell you where it’s quoted from,” as a solution against infringement claims. On the idea of the GPL license and “derivative” works, Guadamuz again disagrees, arguing that the issue at hand comes down to how the GPL defines modified works, and that “derivation, modification, or adaptation (depending on your jurisdiction) has a specific meaning within the law and the license.” “You only need to comply with the license if you modify the work, and this is done only if your code is based on the original to the extent that it would require a copyright permission, otherwise it would not require a license,” writes Guadamuz. “As I have explained, I find it extremely unlikely that similar code copied in this manner would meet the threshold of copyright infringement, there is not enough code copied...”


Django Vs Express: The Key Differences To Observe in 2021

Django is a Python framework that enables rapid development. It has a pragmatic and clean design. It is recognized for its ‘batteries included’ philosophy, meaning it is ready to use out of the box. Here are some of the vital features of Django: Django handles content management, user authentication, site maps, and RSS feeds effectively; Extremely fast: the framework was designed to help programmers take web applications from initial conception to project completion as rapidly as possible. ... Express.js is a flexible and minimal Node.js web application framework that provides a robust set of features for mobile and web-based apps. With numerous HTTP utility methods and middleware at your disposal, creating a dynamic API is quick and easy. Many popular web frameworks are built on top of Express.js. Below are some of its noteworthy features: Middleware is a part of the framework that has access to the client request, the database, and other middleware, and is primarily responsible for organizing the framework's different functions; Express.js exposes several commonly used Node.js features as functions that can be freely employed anywhere in the application.


Unleashing the Power of MLOps and DataOps in Data Science

Data is overwhelming, and so is the science of mining, analyzing, and delivering it for real-time consumption. No matter how good data is for business, it can still put the privacy of millions of users at unimaginable risk. That is exactly why there is a sudden inclination towards more automated processes. In the past year, enterprises sticking to conventional analytics have realized that they will not survive any longer without a makeover. For example, enterprises are experimenting with micro-databases, each storing master data for a particular business entity only. There is also an increase in the adoption of self-service practices to discover, cleanse, and prepare data. They have understood the importance of embracing the ‘XOps’ mindset and delegating more important roles to MLOps and DataOps practices. MLOps is important because bringing ML models into production is more difficult than training them or deploying them as APIs. The complexity worsens further in the absence of governance tools.


TrickBot Spruces Up Its Banking Trojan Module

TrickBot is a sophisticated (and common) modular threat known for stealing credentials and delivering a range of follow-on ransomware and other malware. But it started out as a pure-play banking trojan, harvesting online banking credentials by redirecting unsuspecting users to malicious copycat websites. According to researchers at Kryptos Logic Threat Intelligence, this functionality is carried out by TrickBot’s webinject module. When a victim attempts to visit a target URL (like a banking site), the TrickBot webinject package performs either a static or dynamic web injection to achieve its goal, as researchers explained: “The static inject type causes the victim to be redirected to an attacker-controlled replica of the intended destination site, where credentials can then be harvested,” they said, in a Thursday posting. “The dynamic inject type transparently forwards the server response to the TrickBot command-and-control server (C2), where the source is then modified to contain malicious components before being returned to the victim as though it came from the legitimate site.”


How a college student founded a free and open source operating system

FreeDOS was a very popular project throughout the 1990s and into the early 2000s, but the community isn’t as big these days. But it’s great that we are still an engaged and active group. If you look at the news items on our website, you’ll see we post updates on a fairly regular basis. It’s hard to estimate the size of the community. I’d say we have a few dozen members who are very active. And we have a few dozen others who reappear occasionally to post new versions of their programs. I think to maintain an active community that’s still working on an open source DOS from 1994 is a great sign. Some members have been with us from the very beginning, and I’m really thankful to count them as friends. We do video hangouts on a semi-regular basis. It’s great to finally “meet” the folks I’ve only exchanged emails with over the years. It's meetings like this when I remember open source is more than just writing code; it's about a community. And while I've always done well with our virtual community that communicates via email, I really appreciated getting to talk to people without the asynchronous delay or artificial filter of email—making that real-time connection means a lot to me.


Let Google Cloud’s predictive services autoscale your infrastructure

Predictive autoscaling uses your instance group’s CPU history to forecast future load and calculate how many VMs are needed to meet your target CPU utilization. Our machine learning adjusts the forecast based on recurring load patterns for each MIG. You can specify how far in advance you want autoscaler to create new VMs by configuring the application initialization period. For example, if your app takes 5 minutes to initialize, autoscaler will create new instances 5 minutes ahead of the anticipated load increase. This allows you to keep your CPU utilization within the target and keep your application responsive even when there’s high growth in demand. Many of our customers have different capacity needs during different times of the day or different days of the week. Our forecasting model understands weekly and daily patterns to cover for these differences. For example, if your app usually needs less capacity on the weekend our forecast will capture that. Or, if you have higher capacity needs during working hours, we also have you covered.
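The core arithmetic is simple even if the forecasting model is not. The Python sketch below is a naive stand-in for the ML forecast described above: it assumes load repeats weekly and looks one initialization period ahead, then converts the forecast into a VM count from the target utilization. This is an illustration of the mechanism, not how Google's autoscaler is actually implemented:

```python
import math

def forecast_from_history(history: list[float], lead_minutes: int,
                          period: int = 7 * 24 * 60) -> float:
    """Naive seasonal forecast: assume load repeats weekly, and look
    lead_minutes ahead (the app's initialization time) so new VMs are
    ready before the demand arrives. history holds one CPU-cores
    sample per minute, covering at least one full week."""
    return history[-period + lead_minutes]

def vms_needed(forecast_cpu_cores: float, target_utilization: float,
               cores_per_vm: int = 1) -> int:
    """How many VMs cover the forecast load at the target utilization."""
    return max(1, math.ceil(forecast_cpu_cores
                            / (target_utilization * cores_per_vm)))

# e.g. a 5-minute initialization period and a 60% target CPU utilization,
# with a flat 4 cores of load observed over the past week
history = [4.0] * (7 * 24 * 60)
print(vms_needed(forecast_from_history(history, 5), 0.60))
```

A real forecaster blends daily and weekly seasonality with recent trend; the point here is only that forecast load, lead time, and target utilization together determine how many instances to create ahead of demand.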


The IoT Cloud Market

Cloud computing and the Internet of Things (IoT) have become inseparable when one or the other is discussed and with good reason: You really can’t have IoT without the cloud. The cloud, a grander idea that stands on its own, is nonetheless integral to the IoT platform’s success. The Internet of Things is a system of interrelated computing devices, mechanical and digital machines, objects, and other devices provided with unique identifiers (an IP address) and the ability to transfer data over a network without requiring human-to-human or human-to-computer interaction. Whereas the traditional internet consists of clients – PCs, tablets, and smartphones, primarily – the Internet of Things could be cars, street signs, refrigerators, or watches. Whereas traditional internet interaction relies on human input, IoT is almost totally automated. Because the bulk of IoT devices are not in traditional data centers and almost all are connected wirelessly, they are reliant on the cloud for connectivity. For example, connected cars that send up terabytes of telemetry aren’t always going to be near a data center to transmit their data, so they need cloud connectivity.



Quote for the day:

"Strong convictions precede great actions." -- James Freeman Clarke

Daily Tech Digest - January 31, 2019

Singapore releases guidelines for deployment of autonomous vehicles

Permanent Secretary for Transport and chairman of the Committee on Autonomous Road Transport for Singapore, Loh Ngai Seng, said plans were underway to launch a pilot deployment of autonomous vehicles in Punggol, Tengah, and Jurong Innovation District in the early 2020s, and TR 68 would help guide industry players in "the safe and effective deployment" of such vehicles in the city-state.  Enterprise Singapore's director-general of quality and excellence group Choy Sauw Kook said: "In addition to safety, TR 68 provides a strong foundation that will ensure interoperability of data and cybersecurity that are necessary for the deployment of autonomous vehicles in an urban environment. The TR 68 will also help to build up the autonomous vehicle ecosystem, including startups and SMEs (small and midsize enterprises) as well as testing, inspection, and certification service providers."


Network programmability in 5G: an invisible goldmine for service providers and industry

5G network programmability value chain
5G promises many disruptive functionalities, such as ultra-low latency communication, high bandwidth/throughput, higher security, and network slicing, all of which have the potential to address new business opportunities not served by service providers today. But another functionality not always mentioned, one with equal business potential, is network exposure, which can enable new levels of programmability in telecom core networks. Programmability in 5G core networks allows providers to open up telecom network capabilities and services to third-party developers, allowing them to create new use cases that don’t exist today. This is possible thanks to standardized APIs on the new network architecture for 5G. With APIs, a new frontier for business innovation in telecom will open. Application developer partners will focus on new service applications, while telco service providers will focus on a new dimension of experience called “developer experience” and strengthen their position in the OTT value chain.


Internet Of Things (IoT): 5 Essential Ways Every Company Should Use It

Strategic decision-making is where the senior leadership team identifies the critical questions it needs answering. Operational decision-making is where data and analytics are made available to everyone in the organization, often via a self-service tool, to inform data-driven decisions at all levels. More and more companies make IoT-enabled products which connect them directly to their customers’ behaviours and preferences. For example, Fitbit knows how much we all exercise and what our normal sleeping patterns are. Samsung can collect usage data from their smart TVs. Elevator manufacturer Kone learns how their customers are using their elevators and Rolls Royce knows how airlines use the jet engines they make. Even companies that don’t make IoT devices can often gain access to data from other people’s devices; just think of app makers that are able to collect user data because of the data collection and connectivity capabilities of the smartphones or tablets that run them. Used correctly, companies can leverage these insights to make quicker and better business decisions.


No-deal Brexit could lead to data issues, MPs told


The no-deal Brexit planning notice warns that the legal framework for transferring personal data from organisations in the EU to organisations in the UK would have to change when the country leaves the EU.  This means that although businesses will be able to continue to send personal data from the UK to the EU, and would “at the point of exit continue to allow the free flow of personal data from the UK to the EU”, it may not be the same for the other way around. “We’ve been saying for a while that we would like the adequacy discussions to start as soon as possible. But the EU, as with everything else, is saying they won’t start the discussions until we are a third country. So, I’d be surprised if a decision could be made in under a year,” Derrington told the committee. There are also issues relating to legacy data, which was transferred from the EU to the UK before Brexit.


DARPA explores new computer architectures to fix security between systems

A better solution, then, in today's environment is to accept that users need or want to share data and to figure out how to keep the important bits more private, particularly as the data crosses networks and systems, all with varying levels and types of security implementations and ownership. The GAPS thrust will be in isolating the sensitive “high-risk” transactions and providing what the group calls “physically provable guarantees” or assurances. A new cross-network architecture, tracking, and data security will be developed that creates “protections that can be physically enforced at system runtime.” How they intend to do that is still to be decided. Radical forms of VPNs, an encrypted pipe through the internet, would be today’s attempted solution. Whichever method they choose will be part of a $1.5 billion, five-year investment in government and defense electronics systems. And enterprise and the consumer may benefit. “As cloud systems proliferate, most people still have some information that they want to physically track, not just entrust to the ether,” says Walter Weiss, DARPA program manager, in the release.


There's more to WSL than Ubuntu

By integrating WSL with the updated Windows command-line environment, it's possible to integrate it directly with any application that offers a terminal. You can write code in Visual Studio Code, save it directly to a Linux filesystem, and test it from the built-in terminal, all without leaving your PC. And when it's time to deploy to a build system, you don't need to worry about line-ending formats or having to test code on separate systems. Support for SSH also ensures that you've got secure remote access to any Linux servers, in your data center or in the cloud. If you're using WSL to develop and test server applications, then you'll probably want to install SuSE Enterprise Server. It's a popular Linux server, and can be configured to handle most server tasks. With WSL now supported on Windows Server, you can use it to build test environments for cloud applications before deploying them on Azure or another public cloud. SuSE bundles a one-year developer subscription, which gives you more support resources than its standard community-based support forums.


Why we need less people, more skills for digital transformation

The fundamental argument comes down to value. Often in business, a corporate mentality exists in which executives boast about the number of people they have working for their company or on a project because they believe that provides the best value for their clients. This attitude has existed for more than two decades, yet companies are still failing to understand that it might not provide the best value for their business or clients. Companies need to do more research to understand what works for them as an individual business, and often this means they don't need to hire lots of people; rather, they need the right people. While it may seem reassuring to have a large team working on an expensive project, often the work is easier, smoother and quicker when led by a small team who are highly skilled, have good experience and can be there working on the ground together, not spread around or working remotely. This may be more expensive at first, but it is worth it in the long term.



The FTC's cyberinsurance tips: A must-read for small business owners

Dan Smith, president, co-founder, and COO of Zeguro, a cybersecurity company that has grabbed the attention of investors, admits in this PYMNTS article that the company recently came under a spear-phishing attack. It was unsuccessful, but it pointed out a very real need. Most small businesses do not think they need cyberinsurance (only 4% in the US currently have it) or do not know it's available. Smith adds that another problem area is that brokers providing the insurance are not spending enough time explaining it, or may not understand it themselves. To fix the situation, Smith, in the PYMNTS article, announced that Zeguro will be partnering with the QBE Insurance Group to offer tailored cyberinsurance solutions. According to Smith, the idea is to use the company's expertise and acquired cybersecurity intelligence to craft the appropriate cyberinsurance solution for each client. Insurance on any level is a complicated subject, even before adding the complexity of trying to secure a digital infrastructure from cybercriminals, so a partnership like Zeguro and QBE Insurance Group seems like good business.


What programming languages rule the Internet of Things?

Clearly, there’s a consensus set of top-tier IoT programming languages, but all of the top contenders have their own benefits and use cases. Java, the overall most popular IoT programming language, works in a wide variety of environments — from the backend to mobile apps — and dominates in gateways and in the cloud. C is generally considered the key programming language for embedded IoT devices, while C++ is the most common choice for more complex Linux implementations. Python, meanwhile, is well suited for data-intensive applications. Given the complexities, maybe IoT for All put it best. The site noted that, “While Java is the most used language for IoT development, JavaScript and Python are close on Java's heels for different subdomains of IoT development.” Perhaps, the most salient prediction, though, turns up all over the web: IoT development is multi-lingual, and it's likely to remain multi-lingual in the future.


How to accelerate digital identity in the UK


To encourage the reuse of a digital identity, the critical first step involves striking the right balance in the initial creation of a digital identity, based on the appropriate level of trust and friction for a first-time interaction. Digital services must be designed with the appropriate initial levels of trust, subsequently increasing levels of trust when required. It is a mistake to start with the maximum level of trust, which may be too high for the service. Instead, enhance trust as and when required. Digital identity standards allow services to map their increasing identity trust requirements effectively. Digital identity should be used at the point of need, with appropriate controls where absolutely necessary to complete the task. There is evidence that motivated users achieve high levels of success in verifying their identity in the right circumstances. The UK identity standards, built in response to real-world threats and risks, are world-leading, support the European Union’s eIDAS equivalence, and are closely aligned to the US NIST 800-63-A standard.
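The step-up trust model described above can be sketched in a few lines. This is a minimal illustration, assuming hypothetical action names and numeric trust levels; none of these names or levels come from the UK identity standards themselves:

```python
# Hypothetical sketch of "step-up" identity trust: each action declares the
# minimum trust level it needs, and the service escalates verification only
# when the user attempts something above their current level.
# All names and levels here are illustrative, not from any real standard.

REQUIRED_TRUST = {
    "view_public_info": 0,   # no identity proofing needed
    "update_address": 1,     # basic verification (e.g. email/SMS)
    "claim_benefit": 2,      # documentary evidence checked
}

def action_allowed(action: str, current_trust: int) -> bool:
    """Return True if the session's trust level already covers the action."""
    return current_trust >= REQUIRED_TRUST[action]

def required_step_up(action: str, current_trust: int) -> int:
    """How many levels of additional verification the user must complete."""
    return max(0, REQUIRED_TRUST[action] - current_trust)

# A first-time user starts at the lowest level and steps up only on demand.
print(action_allowed("view_public_info", 0))  # True
print(required_step_up("claim_benefit", 1))   # 1
```

The point of the sketch is the ordering: the service never demands the maximum level up front, only the delta needed for the task at hand.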



Quote for the day:


"Leading people is like cooking. Don't stir too much; it annoys the ingredients and spoils the food" -- Rick Julian


Daily Tech Digest - June 05, 2018

10 Open Source Security Tools You Should Know

The people, products, technologies, and processes that keep businesses secure all come with a cost — sometimes quite hefty. That is just one of the reasons why so many security professionals spend at least some of their time working with open source security software. Indeed, whether for learning, experimenting, dealing with new or unique situations, or deploying on a production basis, security professionals have long looked at open source software as a valuable part of their toolkits.  However, as we all are aware, open source software does not map directly to free software; globally, open source software is a huge business. With companies of various sizes and types offering open source packages and bundles with support and customization, the argument for or against open source software often comes down to its capabilities and quality. For the tools in this slide show, software quality has been demonstrated by thousands of users who have downloaded and deployed them. The list is broken down, broadly, into categories of visibility, testing, forensics, and compliance. If you don't see your most valuable tool on the list, please add it in the comments.



The growing ties between networking roles and automation

Automation was expected to steal jobs and replace human intelligence. But as network automation use cases have matured, Kerravala said, employees and organizations increasingly see how automating menial network tasks can benefit productivity. To automate, however, network professionals need programming skills to determine the desired network output: they need to be able to tell the network what they want it to do. All of this brings me to an obvious term that's integral to automation and network programming: to program means to input data into a machine to cause it to do a certain thing, or, by another definition, "to provide a series of instructions." To give effective instructions, a person must understand the purpose of the instructions being relayed. A person needs the foundation -- or the why of it all -- to get to the actual how. Regarding network automation, the why is to ultimately achieve network readiness for what the network needs to handle, whether that's new applications or more traffic, Cisco's Leary said.


5 ways location data is making the world a better place

In the insurance sector, detailed data creates better predictions and more accurate customer quotes. Yet potential purchasers often don’t know the information needed for rigorous risk assessments, such as the distance of their house from water. Furthermore, lengthy and burdensome questionnaires can lose firms business; analysis from HubSpot found that reducing form fields improves customer conversions. PCA Predict uses its Location Intelligence platform to compile free data from the Land Registry and Ordnance Survey, including LiDAR height maps, as well as commercial address data, to determine accurate information about a potential customer’s property, such as distance from a river network, height, footprint, whether the property is listed and its risk of wind damage. The model is also being developed to determine a building’s age using machine learning and road layout. “We take disparate datasets and apply different types of analysis to extract easy-to-use attributes for insurers,” says Dr Ian Hopkinson, senior data scientist at GBG, the parent company of PCA.


Adoption of Augmented Analytics Tools Is Increasing Among Indian Organizations

Indian organizations are increasingly moving from traditional enterprise reporting to augmented analytics tools that accelerate data preparation and data cleansing, said Gartner, Inc. This change is set to positively impact the analytics and business intelligence (BI) software market in India in 2018. Gartner forecasts that analytics and BI software market revenue in India will reach US$304 million in 2018, an 18.1 percent increase year over year. ... "Indian organizations are shifting from traditional, tactical and tool-centric data and analytics projects to strategic, modern and architecture-centric data and analytics programs," said Ehtisham Zaidi, principal research analyst at Gartner. "The 'fast followers' are even looking to make heavy investments in advanced analytics solutions driven by artificial intelligence and machine learning, to reduce the time to market and accuracy of analytics offerings."


Apple’s Core ML 2 vs. Google’s ML Kit: What’s the difference?

A major difference between ML Kit and Core ML is support for both on-device and cloud APIs. Unlike Core ML, which can’t natively deploy models that require internet access, ML Kit leverages the power of Google Cloud Platform’s machine learning technology for “enhanced” accuracy. Google’s on-device image labeling service, for example, features about 400 labels, while the cloud-based version has more than 10,000. ML Kit offers a couple of easy-to-use APIs for basic use cases: text recognition, face detection, barcode scanning, image labeling, and landmark recognition. Google says that new APIs, including a smart reply API that supports in-app contextual messaging replies and an enhanced face detection API with high-density face contours, will arrive in late 2018. ML Kit doesn’t restrict developers to prebuilt machine learning models. Custom models trained with TensorFlow Lite, Google’s lightweight offline machine learning framework for mobile devices, can be deployed with ML Kit via the Firebase console, which serves them dynamically.


How to evaluate web authentication methods

Two attributes I hadn’t given a lot of thought to are “requiring explicit consent” and “resilient to leaks from other verifiers.” The former ensures that a user’s authentication is not initiated without them knowing about it, and the latter is about preventing related authentication secrets from being used to deduce the original authentication credential. The authors evaluate all the covered authentication solutions across all attributes, and they include a nice matrix chart so you can see how each compares to the others. It’s a genius table that should have been created a long time ago. The authors rate each authentication option as satisfying, not satisfying or partially satisfying each attribute. The attributes aren’t ranked, but anyone could easily take the unweighted framework, add or delete attributes, and weight it with their own needed importance. For example, many authentication evaluators looking for real-world solutions will want to add cost (both initial and ongoing) and vendor product solutions. The authors’ candid conclusions include: “A clear result of our exercise is that no [authentication] scheme we examined is perfect – or even comes close to perfect scores.”
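As a minimal sketch of weighting the framework yourself, here is one way it might look. The attribute names, weights, scheme names and scores below are invented for illustration, not taken from the paper:

```python
# Hypothetical weighted version of an unweighted evaluation matrix.
# Scores: 1.0 = satisfies the attribute, 0.5 = partially, 0.0 = does not.

weights = {
    "explicit_consent": 3,     # what our deployment cares about most
    "resilient_to_leaks": 2,
    "cost": 1,                 # an attribute we added ourselves
}

schemes = {
    "passwords":     {"explicit_consent": 1.0, "resilient_to_leaks": 0.0, "cost": 1.0},
    "hardware_keys": {"explicit_consent": 1.0, "resilient_to_leaks": 1.0, "cost": 0.5},
}

def weighted_score(scheme: dict) -> float:
    """Normalized weighted score in [0, 1] for one authentication scheme."""
    total = sum(weights.values())
    return sum(weights[a] * scheme[a] for a in weights) / total

for name, attrs in schemes.items():
    print(f"{name}: {weighted_score(attrs):.2f}")
```

Swapping in your own attributes and weights is a one-line change, which is exactly what makes the authors' unweighted matrix so reusable.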


Advanced Architecture for ASP.NET Core Web API


Before we dig into the architecture of our ASP.NET Core Web API solution, I want to discuss what I believe is a single benefit which makes .NET Core developers' lives so much better; that is, Dependency Injection (DI). Now, I know you will say that we had DI in .NET Framework and ASP.NET solutions. I will agree, but the DI we used in the past would be from third-party commercial providers or maybe open source libraries. They did a good job, but for a good portion of .NET developers, there was a big learning curve, and all DI libraries had their unique way of handling things. Today with .NET Core, we have DI built right into the framework from the start. Moreover, it is quite simple to work with, and you get it out of the box. The reason we need to use DI in our API is that it allows us to have the best experience decoupling our architecture layers, and also allows us to mock the data layer, or have multiple data sources built for our API. To use the .NET Core DI framework, just make sure your project references the Microsoft.AspNetCore.All NuGet package (which contains a dependency on Microsoft.Extensions.


Intuitively Understanding Convolutions for Deep Learning


The advent of powerful and versatile deep learning frameworks in recent years has made implementing convolution layers in a deep learning model an extremely simple task, often achievable in a single line of code. However, understanding convolutions, especially for the first time, can often feel a bit unnerving, with terms like kernels, filters, channels and so on all stacked onto each other. Yet convolutions as a concept are fascinatingly powerful and highly extensible, and in this post we’ll break down the mechanics of the convolution operation step by step, relate it to the standard fully connected network, and explore just how they build up a strong visual hierarchy, making them powerful feature extractors for images. The 2D convolution is a fairly simple operation at heart: you start with a kernel, which is simply a small matrix of weights. This kernel “slides” over the 2D input data, performing an elementwise multiplication with the part of the input it is currently on, and then summing up the results into a single output pixel.
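The sliding-kernel mechanic described above can be written out directly. Here is a minimal NumPy sketch of a "valid" 2D convolution; like most deep learning frameworks, it skips kernel flipping, so it is technically cross-correlation:

```python
import numpy as np

def conv2d(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Slide the kernel over the input, multiply elementwise, and sum the
    result into one output pixel per position ('valid' padding, stride 1)."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

# A 3x3 kernel applied to a 5x5 input yields a 3x3 output.
img = np.arange(25, dtype=float).reshape(5, 5)
k = np.array([[0., 0., 0.], [0., 1., 0.], [0., 0., 0.]])  # identity kernel
print(conv2d(img, k))  # the centre 3x3 of the input
```

The two nested loops make the mechanics explicit; real frameworks replace them with a single vectorized (and GPU-accelerated) operation.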


Windows Server 2019 embraces SDN

The new virtual network peering functionality in Windows Server 2019 allows enterprises to peer their own virtual networks in the same cloud region through the backbone network, making multiple virtual networks appear as a single network. Stretched networks have been around for years and have given organizations the ability to put server, application and database nodes in different sites. However, the challenge has always been the IP addressing of the nodes in opposing sites. When there were only two static sites in a traditional wide area network, the IP scheme was relatively static: you knew the subnet and addressing of Site A and Site B. However, in the public cloud and multi-cloud world – where your target devices may actually shift between racks, cages, datacenters, regions or even hosting providers – having addresses that may change based on failover, maintenance, elasticity changes, or network changes creates a problem. Network administrators already spend significant time, and will spend drastically more, addressing, readdressing and updating device tables to keep up with the dynamic movement of systems.


Managing a hybrid cloud computing environment


Ensuring the security of physical edge networking connections and the connectivity of all communication is equally essential. This requires redundant networking components that utilize built-in failover capabilities. Finally, careful selection of the power infrastructure is vital to supporting all elements of edge computing. The ability to maintain power at all times via the use of backup power and integration of the remote monitoring of the power infrastructure into the customer’s management system are paramount. You can do this by seeking UPSs, rackmount power distribution units (PDUs) and power management software with remote capabilities. Being able to remotely reboot UPSs or PDUs can be extremely helpful in edge applications. In addition, solutions like Eaton’s Intelligent Power Manager software can enhance your disaster avoidance plan by allowing you to set power management alerts, configurations and action policies. By creating action policies for remediation, Eaton enables you to automate server power capping, load shedding and/or virtual machine migration should problems occur.



Quote for the day:


"Don't be buffaloed by experts and elites. Experts often possess more data than judgement." -- Colin Powell


Daily Tech Digest - February 06, 2018

Logistic Regression Using Python

The goal of a binary classification problem is to predict a class label, which can take one of two possible values, based on the values of two or more predictor variables (sometimes called features in machine learning terminology). For example, you might want to predict the sex (male = 0, female = 1) of a person based on their age, annual income and height. There are many different ML techniques you can use for binary classification, and logistic regression is one of the most common. Logistic regression is best explained by example. ... This article explains how to implement logistic regression using Python. There are several machine learning libraries that have built-in logistic regression functions, but using a code library isn't always feasible for technical or legal reasons. Implementing logistic regression from scratch gives you full control over your system and knowledge that can enable you to use library code more effectively.
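As a hedged sketch of the from-scratch approach (not the article's actual code), here is a minimal logistic regression trained with stochastic gradient descent on a toy one-feature dataset:

```python
import math

def train_logistic(xs, ys, lr=0.1, epochs=2000):
    """Minimal from-scratch logistic regression via stochastic gradient
    descent. xs: list of feature vectors, ys: list of 0/1 labels."""
    n = len(xs[0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # sigmoid
            err = p - y                      # gradient of log loss w.r.t. z
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Class label: 1 when the predicted probability reaches 0.5."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if 1.0 / (1.0 + math.exp(-z)) >= 0.5 else 0

# Toy data: the label is 1 when the single feature exceeds roughly 0.5.
xs = [[0.1], [0.2], [0.3], [0.7], [0.8], [0.9]]
ys = [0, 0, 0, 1, 1, 1]
w, b = train_logistic(xs, ys)
print(predict(w, b, [0.15]), predict(w, b, [0.85]))  # prints: 0 1
```

Everything here is standard-library Python, which is precisely the "full control" scenario the article mentions when a library isn't an option.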


How to focus on solutions, rather than playing the blame game between business and IT

This perennial blame game follows us in our personal lives as well, especially after recent political events. Hop onto your favorite 24-hour cable or radio news show, and it seems the world is on the brink of destruction due to the opposing political party. Every ill in the world, and there are dozens of new ones every day, is the fault of the other side. Take a moment to flip up or down a few channels, and you'll find a station representing the other political party, equally outraged at a world teetering on the brink of destruction due to the actions of your party. ... Amplifying grievances, both real and imagined, has become a big business and surrounds us in our professional and personal lives. It's easy to see why: when you can blame all your professional and personal problems on another party, one that you're unable to change or impact, you're absolved of any accountability or control over your own destiny.


Here's What Happens When Your Mom Or Dad Steals Your Identity

Minors are attractive targets for identity theft. Because of their young age, they have clean credit reports and often don't discover the theft until they reach adulthood and apply for credit, John Krebs, identity theft program manager with the Federal Trade Commission, told BuzzFeed News. And their social security number and other personal information is easily available to family members — so easily available that there are cases of parents secretly using their adult children's information to open lines of credit. Hailee, a 23-year-old community college student in Pennsylvania, told BuzzFeed News she is working off $500 in debt on a credit card she didn't know existed until recently. Her mother opened the account in her name in 2015 and used it to replace a broken air conditioner. Hailee said she didn't discover the account until Wells Fargo began pestering her about late payments.


Data Science is Changing and Data Scientists will Need to Change Too


There’s a sea change underway in data science. It’s changing how companies embrace data science and it’s changing the way data scientists do their job. The increasing adoption and strategic importance of advanced analytics of all types is the backdrop. There are two parts to this change.  One is what is happening right now as analytic platforms build out to become one-stop shops for data scientists. But the second and more important is what is just beginning but will now take over rapidly. Advanced analytics will become the hidden layer of Systems of Intelligence (SOI) in the new enterprise applications stack.  Both these movements are changing the way data scientists need to do their jobs and how we create value. Advanced analytic platforms are undergoing several evolutionary steps at once. This is the final buildout in the current competitive strategy being used by advanced analytic platforms to capture as many data science users as possible.


Why Linux is better than Windows or macOS for security

The OS you deploy to your users does make a difference for your security stance, but it isn’t a sure safeguard. For one thing, a breach these days is more likely to come about because an attacker probed your users, not your systems. A survey of hackers who attended a recent DEFCON conference revealed that “84 percent use social engineering as part of their attack strategy.” Deploying a secure operating system is an important starting point, but without user education, strong firewalls and constant vigilance, even the most secure networks can be invaded. And of course there’s always the risk of user-downloaded software, extensions, utilities, plug-ins and other software that appears benign but becomes a path for malware to appear on the system. And no matter which platform you choose, one of the best ways to keep your system secure is to ensure that you apply software updates promptly.

APIs Pose 'Mushrooming' Security Risk

"APIs represent a mushrooming security risk because they expose multiple avenues for hackers to try to access a company's data," explains Terry Ray, CTO of Imperva. "To close the door on security risks and protect their customers, companies need to treat APIs with the same level of protection that they provide for their business-critical web applications.” Nevertheless, APIs remain greatly important for business and IT strategy. "The greatest revenue potential (APIs) provide is removing barriers to growing revenue by integrating platforms and apps so organizations can quickly launch new business models and scale fast," explains Louis Columbus, an enterprise software strategist and principal at IQMS, a manufacturing ERP vendor, in a Forbes piece last year. What's more, APIs are also fueling new methods of developing and deploying software. As organizations seek means to deliver and tweak software faster, they're increasingly breaking up large monolithic code bases into smaller chunks of independent code called microservices.


What is the Industrial IoT? And why the stakes are so high

The industrial internet of things is also referred to as the industrial internet, a term coined by GE, and Internet of Industrial Things. Whatever you call it, the IIoT is different from other IoT applications in that it focuses on connecting machines and devices in industries such as oil and gas, power utilities and healthcare. IoT includes consumer-level devices such as fitness bands or smart appliances and other applications that don’t typically create emergency situations if something goes wrong. Simply stated, there is more at stake with IIoT deployments where system failures and downtime can result in life-threatening or high-risk situations. The IIoT brings computers from IT to operational technology, opening up vast possibilities for instrumentation, leading to major efficiency and productivity gains for almost any industrial operation.


Capacity alone won't assure good cloud performance

Truth be told, performance testing is often an afterthought that typically comes up only when there is a performance problem that the users see and report. Moreover, performance usually becomes an issue when user loads surpass a certain level, which can be anywhere from 5,000 to 100,000 concurrent sessions, depending on the application. So you discover a problem only when you've got high usage, at which point you can't escape the blame. An emerging best practice is to build performance testing into your devops or cloud migration process. This means adding performance tests to the testing mix and looking at how the application workload and connected database deal with loads well beyond what you would expect. It also means looking for a performance testing tool that is compatible with your application, the other devops tools you have, and the target cloud platform where the application is to be deployed.
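A minimal sketch of building such a performance test into a pipeline might look like the following. The endpoint stub is a placeholder you would replace with a real HTTP client call against your deployed application, and the concurrency and request counts are arbitrary examples:

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def timed_call(fn):
    """Time one request; in a real test, fn would hit your deployed endpoint."""
    start = time.perf_counter()
    fn()
    return time.perf_counter() - start

def load_test(fn, concurrency=50, requests=500):
    """Fire `requests` calls across `concurrency` workers and report latency
    percentiles -- the numbers to watch as load grows past expected levels."""
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        latencies = sorted(pool.map(lambda _: timed_call(fn), range(requests)))
    return {
        "p50": statistics.median(latencies),
        "p95": latencies[int(0.95 * len(latencies)) - 1],
        "max": latencies[-1],
    }

# Stand-in for a real HTTP call (an assumption for this sketch).
def fake_endpoint():
    time.sleep(0.001)

print(load_test(fake_endpoint, concurrency=10, requests=100))
```

Wired into CI, a script like this can fail the build when p95 latency exceeds a threshold, which is what turns performance testing from an afterthought into part of the devops process.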


Since threat rigidity occurs when FUD is spread, Weeks suggests that a successful cybersecurity professional will carefully communicate to management how prior incidents were handled and convey new ideas on how to eliminate the current threat. "Any message to a group must contain the minimal amount of critical information needed to support the reaction to a threat," explains Weeks. "Not only evaluating all data points, messaging also carefully considers how the recipient perceived those data points. Knowing an audience and what preconceived ideas and hypothesis they may bring is central to proper communications, especially in a threat-response scenario." ... "Ensuring an organization is confident that a cybersecurity professional is managing a response is arguably just as important, if not more so, than implementing a technical control," Weeks writes, adding it is the only way a cybersecurity professional can maintain his or her credibility.


Using blockchain to solve IoT security challenges

In effect, a “permissioned and private” blockchain could be used to safely on-board IoT and other connected devices, registering them in a private blockchain ledger. New devices attempting to access the network would have to be approved, and found to follow the same security policies to be verified and granted access to the chain – thereby eliminating the possibility for “zombie devices” like the ones that carried out the Dyn DDoS attack. Through this model, IoT devices can communicate with like-IoT devices to determine if the “newbie” is up to par on its security settings, making sure that it only has access to data that authorized IoT devices have permissions for, and that it isn’t siloing data or acting as a ‘thingbot’. For instance, if an employee wants to connect their Fitbit while at work, all they need to do is connect it with another IoT device, which would let the Fitbit know what it needs to do in order to be considered secure enough to receive a connection.
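The tamper-evidence behind such a ledger can be sketched with a simple hash chain. This toy example is our own illustration of the registration idea only; a real permissioned blockchain adds the distribution and consensus the article describes:

```python
import hashlib
import json

class DeviceLedger:
    """Toy append-only hash chain recording which devices were approved and
    under which security policy. Tamper-evidence only; no consensus here."""

    def __init__(self):
        # Genesis block anchors the chain.
        self.chain = [{"device": None, "policy": None, "prev": "0" * 64}]

    def _hash(self, block):
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    def register(self, device_id, policy_version):
        """On-board an approved device by appending a block linked to the last."""
        self.chain.append({
            "device": device_id,
            "policy": policy_version,
            "prev": self._hash(self.chain[-1]),
        })

    def verify(self):
        """True if no registration block has been altered since it was written."""
        return all(
            self.chain[i]["prev"] == self._hash(self.chain[i - 1])
            for i in range(1, len(self.chain))
        )

ledger = DeviceLedger()
ledger.register("fitbit-123", policy_version="v2")
ledger.register("camera-7", policy_version="v2")
print(ledger.verify())            # True
ledger.chain[1]["policy"] = "v1"  # tampering breaks every later link
print(ledger.verify())            # False
```

Because each block commits to the hash of the previous one, quietly downgrading a device's recorded security policy invalidates the rest of the chain, which is the property that blocks "zombie devices" from slipping in unnoticed.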



Quote for the day:


"You must have long term goals to keep you from being frustrated by short term failures." -- Charles C. Noble


Daily Tech Digest - December 15, 2017

Digital Disruption: 10 Ways To Survive & Thrive

Some CEOs are embarking on vision quests to help navigate digital disruption, which is marked by a shift in profitability from one prevailing business model to another. Puthiyamadam, who leads PwC's digital services practice and oversees its experience center, recalls one recent conversation with a CEO client who attended a "digital bootcamp" in Europe. The CEO was told he must join Twitter and that his business would be disrupted in two years. Puthiyamadam quickly assured the CEO that the threats weren't so imminent. Indeed, he regularly cautions clients against acting rashly because the wrong bets, from service ideation to technology choices, can set a business back years. "Don't believe you need to act frantically and in panic mode because your business is going to get completely overwhelmed," Puthiyamadam tells CIO.com.



DevOps in the public sector: Assessing the challenges and the benefits

“The public sector is often saddled with a significant burden of legacy systems which must be maintained and, where possible, modernised,” says Jason Rolles, CEO of software development monitoring software supplier BlueOptima. This means making use of open source development tools, such as Git and Jenkins, but also having the right IT environment to reap the benefits of these DevOps tools. It is inevitable that legacy systems will slow down a DevOps approach which is meant to bring an organisation both flexibility and speed. This shift away from incumbent providers and legacy infrastructure is to do with finance too. But, without the budget needed to move away from legacy technologies, recruiting DevOps personnel gets even harder, and this becomes a vicious cycle that encourages departments to remain the same.


5 tips for better NGINX security that any admin can handle

NGINX continues to rise in popularity. According to the October 2017 Netcraft stats, it has nearly caught up with Apache, meaning more and more people are making use of this lightweight, lightning-fast web server. That also means more and more NGINX deployments need to be secured, and there are many ways to do that. If you're new to NGINX, you will want to make sure to deploy the server in such a way that your foundation is safe. I will walk through five ways to gain better security over NGINX that won't put your skills or resolve to too much of a test. ... It is possible to limit the rate at which NGINX will accept incoming requests. For example, say you want to limit the acceptance of incoming requests to the /wp-admin section. To achieve this, we are going to use the limit_req_zone directive and configure a shared memory zone named one, limited to 30 requests per minute.
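Assuming the standard ngx_http_limit_req_module syntax, the rate limit described above might be configured as follows; the 10 MB zone size and the burst value are our assumptions, not from the article:

```nginx
# In the http {} block: key requests by client address, in a 10 MB shared
# memory zone named "one", at a rate of 30 requests per minute.
limit_req_zone $binary_remote_addr zone=one:10m rate=30r/m;

server {
    # ...
    location /wp-admin {
        # Apply the limit; the small burst allowance is our assumption.
        limit_req zone=one burst=5 nodelay;
    }
}
```

Requests beyond the limit (plus the burst) are rejected, which blunts brute-force attempts against the admin login.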


Cloud computing: Getting bigger but more complicated too

The location of the company offering a cloud service is something that has come under particular scrutiny recently. For example, the UK government's National Cyber Security Centre (NCSC) warned about the use of some cloud-based antivirus products from Russian companies, but also warned more broadly about the use of cloud services within the government supply chain. "The country of origin matters. It isn't everything, and nor is it a simple matter of flags -- there are Western companies who have non-Western contributors to their supply chain, including from hostile states. But in the national security space there are some obvious risks around foreign ownership," NCSC CEO Ciaran Martin wrote in a letter to civil service chiefs. The NCSC noted that government departments might not even be aware they are using cloud-based services: "It's easy to overlook the nature of these cloud interactions, and the security implications. 


Employers And Employees Need To Step Up On Cybersecurity

Even with the clear need for IT and network security experts, kununu found that job security ranked lowest among the factors employees rated. Management changes, layoffs and the lack of a clear plan meant internal organization was at an all-time low, hurting morale; and disaffected employees can always be equated with security vulnerability for a company. Within the reviews, employees even shared that their companies were not up to par technologically and were using antiquated kit, offering hackers a free pass into companies' most sensitive data. Based in Vienna and leading the European market, kununu launched in the US last year in a joint venture with Monster and has already collected more than half a million reviews on its website. Its reviews are broken down into 18 key dimensions of workplace satisfaction to provide job seekers with the workplace insights that matter in order to make sound work-life decisions.


Could blockchains rattle ECM?

Blockchains are distributed, crowd-validated ledgers which use internet-connected computers and open source software all over the world to verify transactions. One of their major benefits in financial transactions is their immunity to tampering, thanks to the built-in consensus mechanism. In theory, this could also make blockchain a secure, verifiable and permanent solution for exchanges of any kind – for managing records, for instance. Sweden’s land registry authority is currently exploring blockchains’ potential as a mechanism for recording property deals. In this context, the blockchain would confirm and save each step in the contract process between buyers and sellers, while making each deal’s information transparent to all parties such as banks and local governments. But how far could this go, and what does it mean for ECM as we know it? To assess the potential and any limitations we must consider what sets blockchains’ approach apart.
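The tamper-evidence the excerpt describes comes from each block's hash covering the previous block's hash, so altering an earlier record invalidates everything after it. A minimal sketch of that hash-chaining property alone (no distributed consensus, networking, or signatures; all function and field names are illustrative):

```python
import hashlib
import json

def block_hash(record: dict, prev_hash: str) -> str:
    """Hash a record together with the previous block's hash."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def append(chain: list, record: dict) -> None:
    """Add a block whose hash commits to the entire history so far."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"record": record, "hash": block_hash(record, prev)})

def verify(chain: list) -> bool:
    """Recompute every hash; any edited record breaks the chain."""
    prev = "0" * 64
    for block in chain:
        if block["hash"] != block_hash(block["record"], prev):
            return False
        prev = block["hash"]
    return True

chain = []
append(chain, {"deed": "plot 42", "buyer": "A", "seller": "B"})
append(chain, {"deed": "plot 7", "buyer": "C", "seller": "D"})
print(verify(chain))                 # the untouched ledger verifies
chain[0]["record"]["buyer"] = "X"    # tamper with an earlier deed
print(verify(chain))                 # verification now fails
```

In a real blockchain, many independent nodes hold copies of this chain and a consensus protocol decides which appends are accepted, which is what makes tampering impractical rather than merely detectable.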


Enterprises that wish to deliver disruptive innovation must understand their own strategy and objectives, their current operational environment and challenges, and their external environment. They can begin by identifying opportunity areas and key markets. Once a consensus is reached, they can identify priority market segments. This may lead to redefining market segments and segmentation criteria. At this point, they should analyze the industry structure (segment clients, suppliers, potential new entrants, substitution products) and then identify what makes each player powerful, using strategic tools. For example, "The Five Competitive Forces That Shape Strategy" shows that suppliers boasting strong concentration, high switching costs, genuine differentiation, unique intellectual property (IP) and strong value for clients will command higher prices than industry incumbents.


20 Ways To Rekindle Your Passion For IT

In March 2017, Zucker left the financial services firm and launched a new career providing training and advisory services in project management, agile development and leadership. "The change has been wonderful," he declares. "I'm working harder than before, but I'm passionate and enthusiastic about what I am doing." Zucker is hardly the only IT leader to watch his early enthusiasm spill into a drain of frustration, boredom and ennui. A 2016 Stress and Pride survey, sponsored by IT talent management and solutions company TEK Systems, found that a sizeable number of senior-level IT professionals are dissatisfied with their jobs. In fact, 24 percent of respondents stated that while they were proud they had chosen IT as a career, they were not proud of their current role, assignments and responsibilities. Worse yet, a discouraging 16 percent agreed that if they had to do it all over again, they wouldn't go into IT.



An Effective Cyber Hygiene Program Can Save A Business


Most small businesses have overarching cybersecurity plans that establish antivirus programs, firewalls, and other defenses to thwart cyberattacks. However, these plans rarely consider individual behavior, which is why more than half of all cyberattacks target American small businesses. In addition to these cybersecurity measures, businesses need to consider cyber hygiene. Cyber hygiene, also called security hygiene, is general behavior that keeps individuals safe from cyberattack. Unlike cybersecurity, which pertains to an organization's large-scale efforts, hygiene consists of an individual's responsibilities and actions. For example, an IT department might build and monitor firewalls and intrusion detection systems, but if individual employees fail to generate strong passwords, install software updates, or run regular malware scans, then a business remains insecure.


BlueBorne Attack Highlights Flaws in Linux, IoT Security

Researchers at IoT security firm Armis earlier this year discovered BlueBorne, a new group of airborne attacks. The vulnerabilities let attackers take full control of any device running Linux, or an OS derived from Linux, putting the majority of IoT devices at risk of exposure. The researchers discussed and demonstrated their latest findings at Black Hat Europe 2017, held last week in London. Vulnerabilities in the Bluetooth stack have been overlooked for the past decade, they explained. Bluetooth, often perceived as peripheral, could benefit attackers if they successfully break into a high-privilege device. As the researchers demonstrated, one compromised product can spread its attack over the air to other devices within Bluetooth range. "These attacks don't require any user interaction or any authentication," said Armis head researcher Ben Seri in their presentation.



Quote for the day:


"The most common way people give up their power is by thinking they don't have any." -- Alice Walker