
Daily Tech Digest - May 16, 2023

Law enforcement crackdowns and new techniques are forcing cybercriminals to pivot

Because of stepped-up law enforcement efforts, cybercriminals are also facing a crisis in cashing out their cryptocurrencies, with only a handful of laundering vehicles in place due to actions against crypto-mixers that help obfuscate the money trail. "Eventually, they'll have to cash out to pay for their office space in St. Petersburg to pay for their Lambos. So, they're going to need to find an exchange," Burns Coven said. In the meantime, cybercriminals are simply sitting on their money, much like stuffing cash under a mattress. "It's been a tumultuous two years for the threat actors," she said. "A lot of law enforcement takedowns, challenging operational environments, and harder to get funds. And we're seeing this sophisticated laundering technique called absolutely nothing doing, just sitting on it." Despite the rising number of challenges, "I don't think there's a mass exodus of threat actors from ransomware," Burns Coven tells CSO, saying they are shifting tactics rather than exiting the business altogether.


5 IT management practices certain to kill IT productivity

Holding people accountable amounts to root cause analysis predicated on the assumption that if something goes wrong, it must be someone’s fault. It’s a flawed assumption because most often, when something goes wrong, it’s the result of bad systems and processes, not someone screwing up. When a manager holds someone accountable, they’re really just blame-shifting. Managers are, after all, accountable for their organization’s systems and processes, aren’t they? Second problem: If you hold people accountable when something goes wrong, they’ll do their best to conceal the problem from you. And the longer nobody deals with a problem, the worse it gets. One more: If you hold people accountable whenever something doesn’t work, they’re unlikely to take any risks, because why would they? Why it’s a temptation: Finding someone to blame is, compared to serious root cause analysis, easy, and fixing the “problem” is, compared to improving systems and practices, child’s play. As someone once said, hard work pays off sometime in the indefinite future, but laziness pays off right now.


How AI ethics is coming to the fore with generative AI

The discussion of AI ethics often starts with a set of principles guiding the moral use of AI, which is then applied in responsible AI practices. The most common ethical principles include being human-centric and socially beneficial, being fair, offering explainability and transparency, being secure and safe, and showing accountability. ... “But it’s still about saving lives and while the model may not detect everything, especially the early stages of breast cancer, it’s a very important question,” Sicular says. “And because of its predictive nature, you will not have everyone answering the question in the same fashion. That makes it challenging because there’s no right or wrong answer.” ... “With generative AI, you will never be able to explain 10 trillion parameters, even if you have a perfectly transparent model,” Sicular says. “It’s a matter of AI governance and policy to decide what should be explainable or interpretable in critical paths. It’s not about generative AI per se; it's always been a question for the AI world and a long-standing problem.”


Design Patterns Are A Better Way To Collaborate On Your Design System

You probably don’t think of your own design activities as a “pattern-making” practice, but the idea has a lot of very useful overlap with the practice of making a design system. The trick is to collaborate with your team to find the design patterns in your own product design, the parts that repeat in different variations that you can reuse. Once you find them, they are a powerful tool for making design systems work with a team. ... All designers and developers can make their design system better and more effective by focusing on patterns first (instead of the elements), making sure that each is completely reusable and polished for any context in their product. “Pattern work can be a fully integrated part of both getting some immediate work done and maintaining a design system. ... This kind of design pattern activity can be a direct path for designers and developers to collaborate, to align the way things are designed with the way they are built, and vice-versa. For that purpose, a pattern does not have to be a polished design. It can be a rough outline or wireframe that designers and developers make together. It needs no special skills and can be started and iterated on by all. 


Digital Twin Technology: Revolutionizing Product Development

Digital twin technology accelerates product development while reducing time to market and improving product performance, Norton says. The ability to design and develop products using computer-aided design and advanced simulation techniques can also facilitate collaboration, enable data-driven decision making, engineer a market advantage, and reduce design churn. “Furthermore, developing an integrated digital thread can enable digital twins across the product lifecycle, further improving product design and performance by utilizing feedback from manufacturing and the field.” Using digital twins and generative design upfront allows better-informed product design, enabling teams to generate a variety of possible designs based on ranked requirements and then run simulations on their proposed design, Marshall says. “Leveraging digital twins during the product use-cycle allows them to get data from users in the field in order to get feedback for better development,” she adds. Digital twin investments should always be aimed at driving business value. 


DevEx, a New Metrics Framework From the Authors of SPACE

Organizations can improve developer experience by identifying the top points of friction that developers encounter, and then investing in improving areas that will increase the capacity or satisfaction of developers. For example, an organization can focus on reducing friction in development tools in order to allow developers to complete tasks more seamlessly. Even a small reduction in wasted time, when multiplied across an engineering organization, can have a greater impact on productivity than hiring additional engineers. ... The first task for organizations looking to improve their developer experience is to measure where friction exists across the three previously described dimensions. The authors recommend selecting topics within each dimension to measure, capturing both perceptual and workflow metrics for each topic, and also capturing KPIs to stay aligned with the intended higher-level outcomes. ... The DevEx framework provides a practical framework for understanding developer experience, while the accompanying measurement approaches systematically help guide improvement.
While there are many companies with altruistic intentions, the reality is that most organizations are beholden to stakeholders whose chief interests are profit and growth. If AI tools help achieve those objectives, some companies will undoubtedly be indifferent to their downstream consequences, negative or otherwise. Therefore, addressing corporate accountability around AI will likely start outside the industry in the form of regulation. Currently, corporate regulation is pretty straightforward. Discrimination, for instance, is unlawful and definable. We can make clean judgments about matters of discrimination because we understand the difference between male and female, or a person’s origin or disability. But AI presents a new wrinkle. How do you define these things in a world of virtual knowledge? How can you control it? Additionally, a serious evaluation of what a company is deploying is necessary. What kind of technology is being used? Is it critical to the public? How might it affect others? Consider airport security. 


Prepare for generative AI with experimentation and clear guidelines

Your first step should be deciding where to put generative AI to work in your company, both short-term and into the future. Boston Consulting Group (BCG) calls these your “golden” use cases — “things that bring true competitive advantage and create the largest impact” compared to using today’s tools — in a recent report. Gather your corporate brain trust to start exploring these scenarios. Look to your strategic vendor partners to see what they’re doing; many are planning to incorporate generative AI into software ranging from customer service to freight management. Some of these tools already exist, at least in beta form. Offer to help test these apps; it will help teach your teams about generative AI technology in a context they’re already familiar with. ... To help discern the applications that will benefit the most from generative AI in the next year or so, get the technology into the hands of key user departments, whether it’s marketing, customer support, sales, or engineering, and crowdsource some ideas. Give people time and the tools to start trying it out, to learn what it can do and what its limitations are. 


Cyberdefense will need AI capabilities to safeguard digital borders

Speaking at CSIT's twentieth anniversary celebrations, where he announced the launch of the training scheme, Teo said: "Malign actors are exploiting technology for their nefarious goals. The security picture has, therefore, evolved. Malicious actors are using very sophisticated technologies and tactics, whether to steal sensitive information or to take down critical infrastructure for political reasons or for profit. "Ransomware attacks globally are bringing down digital government services for extended periods of time. Corporations are not spared. Hackers continue to breach sophisticated systems and put up stolen personal data for sale, and classified information." Teo also said that deepfakes and bot farms are generating fake news to manipulate public opinion, with increasingly sophisticated content that blurs the line between fact and fiction likely to emerge as generative AI tools, such as ChatGPT, mature and become widely available. "Threats like these reinforce our need to develop strong capabilities that will support our security agencies and keep Singapore safe," the minister said. 


Five key signs of a bad MSP relationship – and what to do about them

Red flags to look out for here include overly long and unnecessarily complicated contracts. These are often signs of MSPs making lofty promises, trying to tie you into a longer project, and pre-emptively trying to raise bureaucratic walls to make accessing the services you are entitled to more complex. The advice here is simple – don’t rush the contract signing. Instead, ensure that the draft contract is passed through the necessary channels, so that all stakeholders have complete oversight. Also, do not be tempted by outlandish promises; think pragmatically about what you want to achieve with your MSP relationship, and make sure the contract reflects your goals. If you’re already locked into a contract, consider renegotiating specific terms. ... If projects are moving behind schedule and issues are coming up regularly, this is a sign that your project lacks true project management leadership. Of course, both parties will need some time when the project starts to get processes running smoothly, but if you’re deep into a contract and still experiencing delays and setbacks, this is a sign that all is not well at your MSP. 



Quote for the day:

"The greatest thing is, at any moment, to be willing to give up who we are in order to become all that we can be." -- Max de Pree

Daily Tech Digest - October 29, 2022

7 ways to ruin your IT leadership reputation

Be mindful of the decisions you make. “One careless choice can ruin your reputation and your career,” warns Jim Durham, CIO of Solar Panels Network USA, a national solar panel installation company. “By being aware of the risks and taking responsibility for your actions, you can minimize the damage and learn from your mistakes,” he advises. A careless decision can be anything from selecting the wrong technology to mishandling sensitive data. “Not only are these actions career-destructive, but they can also have lasting negative effects on your enterprise,” Durham notes. CIOs are always pressured by management to make the right decision. It’s important to remember, however, that even the best strategies and intentions can sometimes lead to disastrous results. “If you’re unsure about a decision, it’s always better to err on the side of caution and consult with your team before making a final call,” Durham suggests. Failure is never an option, particularly major failures. “It shows that you’re not capable of handling important tasks,” Durham states. 


Cyber Skills Shortage is Caused by Analyst Burnout

Data shows skilled and experienced professionals are leaving the industry due to burnout and disillusionment. In the UK, the cybersecurity workforce reportedly shrank by 65,000 last year, and according to a recent study, one in three current cybersecurity professionals are planning to change professions. According to ISACA’s State of Cybersecurity 2022 report, the top reasons for cybersecurity professionals leaving include being recruited by other companies (59%), poor financial incentives (48%), limited promotion and development opportunities (47%), high levels of work-related stress (45%) and lack of management support (34%). When discussing the skills shortage, many, by default, think of businesses struggling to recruit for their internal cybersecurity vacancies. However, this is equally challenging for specialist providers of consulting and managed cybersecurity services. Businesses increasingly rely on third-party managed services, particularly mid-size organizations, where outsourcing to a Managed Security Service Provider (MSSP) represents a much more commercially viable solution with considerably less up-front investment.


Data privacy is expensive — here’s how to manage costs

“The true cost of data privacy, broadly, is their trust with their customers,” said Akbar Mohammed, lead data scientist, Fractal AI. “In this era of customers increasingly becoming tech-savvy, as soon as they realize that their data isn’t secure, the company will risk loss of trust from consumers. This eventually results in a lot of business disruption.” Almost all companies that need to collect data for their operations should have a data privacy infrastructure in place. Companies should also set up dedicated security and compliance teams surveying data and technology assets along with maintaining an aggressive threat detection policy. It’s imperative for companies today to have a data strategy, along with policies and procedures governed by a data governance entity. “For large organizations, it’s best to have regular audits or assessments and get privacy-related certifications,” Mohammed said. “Lastly, train your people and make the entire organization aware of your activities, your policies.”


Architectural Patterns for Microservices With Kubernetes

Kubernetes provides many constructs and abstractions to support service and application Deployment. While applications differ, there are foundational concepts that help drive a well-defined microservices deployment strategy. Well-designed microservices deployment patterns play into an often-overlooked Kubernetes strength. Kubernetes is independent of runtime environments. Runtime environments include Kubernetes clusters running on cloud providers, in-house, bare metal, virtual machines, and developer workstations. When Kubernetes Deployments are designed properly, deploying to each of these and other environments is accomplished with the same exact configuration. In grasping the platform independence offered by Kubernetes, developing and testing the deployment of microservices can begin with the development team and evolve through to production. Each iteration contributes to the overall deployment pattern. A production deployment definition is no different than a developer's workstation configuration. 
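
To make the platform-independence point concrete, here is a minimal sketch, assuming the official kubernetes Python client and a hypothetical orders-service:1.0 image, of a Deployment definition that can be applied unchanged to whichever cluster the local kubeconfig points at, whether that is a developer workstation, an in-house cluster, or a cloud provider.

# Sketch only: assumes the `kubernetes` Python client is installed and a
# kubeconfig is present; the service name and image are hypothetical.
from kubernetes import client, config

def build_deployment(name: str = "orders-service", replicas: int = 2) -> client.V1Deployment:
    container = client.V1Container(
        name=name,
        image="orders-service:1.0",  # hypothetical image
        ports=[client.V1ContainerPort(container_port=8080)],
    )
    template = client.V1PodTemplateSpec(
        metadata=client.V1ObjectMeta(labels={"app": name}),
        spec=client.V1PodSpec(containers=[container]),
    )
    spec = client.V1DeploymentSpec(
        replicas=replicas,
        selector=client.V1LabelSelector(match_labels={"app": name}),
        template=template,
    )
    return client.V1Deployment(
        api_version="apps/v1",
        kind="Deployment",
        metadata=client.V1ObjectMeta(name=name),
        spec=spec,
    )

if __name__ == "__main__":
    config.load_kube_config()  # uses whichever cluster the kubeconfig targets
    apps = client.AppsV1Api()
    apps.create_namespaced_deployment(namespace="default", body=build_deployment())

The same script, with no changes to the Deployment itself, works against any of the runtime environments mentioned above, which is exactly the property that lets a deployment pattern evolve from a developer's workstation through to production.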


High data quality key to reducing supply chain disruption

With so many obstacles to overcome, the supply chain needs a saviour – and many experts are pointing to big data to fill the role. Prince believes data will become more important in this new era. He says that after Brexit, “there is uniquely new importance placed on master data, given the customs and other regulatory impacts of moving goods between the two markets”. Also, the greater risks posed in global trade and the need to be resilient mean that the predictive capabilities of data could be crucial. ... The potential of big data is clear – but to get the best results, the data involved needs to be accurate. “Data quality takes on many forms, including accuracy, completeness, timeliness, precision, and granularity,” says Laney. He points out that most organisations don’t have n-tier visibility in their supply chain, which means they don’t understand what is happening beyond the first tier of suppliers in the chain. They may also have incomplete data on where items are in the supply chain or when disruptions will happen.


Privacy assembly in Istanbul calls for adaptation to new necessities

Explaining that new challenges and needs emerged with the development of artificial intelligence (AI) and the metaverse, Koç underlined that protecting personal data should be a requirement. "Unfortunately, we pay the price for the comfort and efficiency provided by technology in the age of data, with privacy," he said. KVKK Chair Faruk Bilir, for his part, said that since its membership in the assembly, Türkiye has given significant importance to international efforts in the field. KVKK leads initiatives for the protection and awareness of personal data and privacy in line with the laws and regulations adopted since 2016, he added. "The protection of individuals' privacy emerges as an unchanging fact of the changing world," Bilir said. The protection of privacy is an indicator of civilization, Bilir said, underlining the importance of a human-oriented approach. Law and ethics are complementary elements to the human-oriented approach, he added. "Technology is indispensable for us, our privacy is our priority," Bilir said.


CISA Releases Performance Goals for Critical Infrastructure

Among the newly recommended measures are implementation of multifactor authentication, making sure to revoke the login credentials of former employees, disabling Microsoft Office macros and prohibiting the connection of unauthorized devices, perhaps by disabling AutoRun. The document also recommends that the operational technology side have a single leader responsible for cybersecurity and that OT and IT staff work to improve their relationship. Organizations should "sponsor at least one 'pizza party' or equivalent social gathering per year" to be attended by the two cybersecurity teams. DHS says it will actively solicit feedback about the goals in the coming months and has set up a GitHub discussions page. The department's next plan is to roll out cybersecurity goals tailored to each sector of critical infrastructure in conjunction with the agencies closest to each sector, such as the Environmental Protection Agency for water systems.


Will Twitter Sink or Swim Under Elon Musk's Direction?

Musk's accompanying "let that sink in" tweet could be, in terms of bang for the buck, the most groan-inducing dad joke of all time. But it shouldn't hide the very real business and security challenges facing Musk, who's already CEO of Tesla and SpaceX, as he takes the helm of a social network sporting 230 million customers. "The bird is freed," Musk tweeted late Thursday, before the $44 billion deal closed Friday, and Twitter filed for delisting from the New York Stock Exchange as it goes private. Like so much with Musk, commentators have been attempting to deduce his planned intentions on numerous fronts. Musk styles himself as a showman, having once tweeted - apparently about nothing in particular - that "the most entertaining outcome is the most likely." ... What state Twitter might be in once Musk is done with it remains unclear. Then again, when you're the richest person in the world, what's a few billion here or there, especially if it keeps people talking about you and guessing at your next move?


How to turbocharge collaboration in innovation ecosystems

Whether you call it socialization or use any other term, the human dimension of innovation is often overlooked or obscured. In part, this is because technology and the covid-19–induced migration to online platforms have garnered a great deal of attention. It’s important to remind managers that innovating as a special form of problem-solving is best tackled by empowering the workforce. Collaboration can be jump-started from many directions, but it can be only as vibrant as the company’s underlying culture of curiosity, learning, and continuous adaptation. In the Veezoo–AXA story, the formal process failed to reach a breakthrough. It was the involvement of specific individuals who were keen to see the collaboration through—often on their own terms—that led to success in building an innovation ecosystem. In fact, it is often through the behaviors and work of certain people that effective structure and discipline emerge across an ecosystem. 


Data Quality as the Centerpiece of Data Mesh

After all, data quality is always context-dependent and the domain teams will best know the business context of the data. From a data quality perspective, data mesh makes good sense as it allows data quality to be defined in a context-specific way–for example, the same data point can be considered “good” for one team but “bad” for another, depending on the context. As an example, let’s take a subscription price column with a fair amount of anomalies in it. Team A is working on cost optimization while Team B is working on churn prediction. As such, price anomalies will be more of an important data quality issue for Team B than for Team A. To make it easier to facilitate ownership between data products (which can be database tables, views, streams, CSV files, visualization dashboards, etc.), the data mesh framework suggests each product should have a Service Level Objective. This will act as a data contract, to establish and enforce the quality of the data it provides: timeliness, error rates, data types, etc.
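
As a rough illustration, here is a hypothetical sketch (team names, thresholds, and figures are invented for the example) of how two consuming teams could express different Service Level Objectives as data contracts over the same subscription price column, so that an identical observed state passes one contract and breaches the other.

# Sketch only: per-consumer data contracts for one data product.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class DataContract:
    consumer: str
    max_staleness: timedelta   # timeliness expectation
    max_anomaly_rate: float    # tolerated share of anomalous rows

    def evaluate(self, last_refresh: datetime, anomaly_rate: float) -> bool:
        fresh_enough = datetime.utcnow() - last_refresh <= self.max_staleness
        clean_enough = anomaly_rate <= self.max_anomaly_rate
        return fresh_enough and clean_enough

# Same observed state of the subscription_price column...
last_refresh = datetime.utcnow() - timedelta(hours=6)
anomaly_rate = 0.04

# ...judged against two context-specific contracts.
team_a = DataContract("cost-optimization", timedelta(days=1), max_anomaly_rate=0.05)
team_b = DataContract("churn-prediction", timedelta(hours=2), max_anomaly_rate=0.01)

print(team_a.evaluate(last_refresh, anomaly_rate))   # True: good enough for Team A
print(team_b.evaluate(last_refresh, anomaly_rate))   # False: breaches Team B's SLO

The point of the sketch is that "good" and "bad" are properties of the contract, not of the data in isolation, which is precisely what the data mesh framing of context-dependent quality implies.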



Quote for the day:

"Humility is a great quality of leadership which derives respect and not just fear or hatred." -- Yousef Munayyer

Daily Tech Digest - May 10, 2020

Opinion: Responsible AI starts with higher education


“This new algorithm will need a lot of pictures of people. What if we use a morgue so we don’t have to worry about consent?” Although this is a fictitious example, modern-day tech workers often face similar questions. Why? Because the rise of artificial intelligence based on machine learning has created a new class of sociotechnical challenges. Now is the time for industry and universities to acknowledge these new challenges and step up to meet them. Since the beginning of the technology industry, educational institutions, legislatures, companies, and developers have worked to improve the quality of products and services. The resulting curricula, laws, corporate policies, standards, and development approaches have provided frameworks for engineers and product managers. Emerging technologies require the development of new frameworks. In the early 2000s, industry had to get serious about computer security. Today, we have a new challenge: How do you turn the goal of responsible AI into code?



How to help data scientists adapt to business culture


Businesses themselves don't understand what the data science discipline is, the work backgrounds from where data scientists are coming, and what it's going to take to acculturate these highly trained data engineers to how a business operates and what it needs. Many data scientists have lived their lives in environments funded by university grants that enabled them to pursue highly theoretical projects that are all about the quest for answers but not necessarily about finding definitive solutions for why customers seem to be suddenly favoring another brand, or why your manufactured products are suddenly experiencing more failures. Companies also struggle with integrating data scientists with their existing business and IT workforces. Often, existing business units and IT have little in common with data scientists, and there are no existing workflows that can help them learn how to optimally work together. Another issue is that businesses aren't always sure what (and when) to expect analytics and results out of their big data projects. Successful use cases exist in most industries, but companies still don't have a good feel for knowing when a data science or analytics project is moving forward and when it is stagnating.



20 ways banks can get AI right


Try to create ‘segments of one’ through the collection of the volume and variety of data that can empower you to pursue automated hyper-personalisation. When clients feel that your service is sensitive and responsive to their individual preferences, they will be happy to share more and more information with you. Think about a situation where you offer the client the chance to donate money to a charity by ‘rounding prices up’. For example, the client purchases a coffee for £1.79 and you offer to put the remaining £0.21 into a charity pot; once the client has collected £20, the pot can be given to a charity, which you, as a bank, will match with an equal donation. Let’s say the client is a paediatrician. In this case, the three potential charities the client can choose from should be about health, children, and medical research. Another client is a music teacher, in which case the three choices can be related to classical music, early talent, and education. These elements of hyper-personalisation have to be fully automated and ideally be propelled by some levels of AI.
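
As a simple illustration of the round-up mechanic described above (the purchase amounts are hypothetical), the arithmetic could look something like this:

# Sketch only: round each purchase up to the next pound, pool the difference,
# and have the bank match the pot once it reaches £20.
from decimal import Decimal, ROUND_CEILING

def round_up_donation(price: Decimal) -> Decimal:
    """Difference between the price and the next whole pound."""
    return price.quantize(Decimal("1"), rounding=ROUND_CEILING) - price

purchases = [Decimal("1.79"), Decimal("3.40"), Decimal("12.25")]
pot = sum(round_up_donation(p) for p in purchases)   # 0.21 + 0.60 + 0.75 = 1.56

MATCH_THRESHOLD = Decimal("20")
if pot >= MATCH_THRESHOLD:
    total_to_charity = pot * 2        # bank matches the client's pot
else:
    total_to_charity = Decimal("0")   # keep accumulating

print(pot, total_to_charity)          # 1.56 0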


Understanding the convergence of IoT and data analytics

Simply collecting IoT data is not enough — “Organisations need to turn this data into value in both a batch (using traditional analytics) and real-time context. It is also not desirable, nor possible in some cases, to do all of your processing at the enterprise level (in the cloud or data centre, for example).” As is the nature of IoT devices, decisions will often need to be made in a localised fashion, including on the device itself, and these decisions will be largely driven by models derived from analytical processes and historical data. “The ability to make the edge ‘smarter’, offload compute workloads to the edge for more efficient processing, support localised or independent/disconnected processing, reduce decision latency, and reduce data transfer requirements are all benefits that may be applied to almost any vertical,” continues Petracek. “Analytics, and the operationalisation of analytical models and pipelines, presents a huge opportunity to organisations, especially given the level of real-time information and context that IoT can provide.”


2020 is about digital optimization not digital transformation


Digital optimization isn’t easy, because different teams are likely to have invested in solutions that don’t communicate with one another and aren’t easy to integrate. Further, every team may be going digital at a different pace – those on the front line dealing with customers every day are under more pressure than those running the organization’s core operations. However, without digital optimization, organizations will be unable to eliminate the silos in their processes even when teams embrace collaboration in the spirit of digital innovation. ... The critical thing to remember when optimizing digital investments is that the organization has one purpose, one mission, and one vision. Hence, the roadmap must consist of straightforward milestones that have an impact, ideally at the enterprise level. At the end of the day, organizations must understand the importance of optimizing their investments in digital, and prioritize it over spending that merely broadens their digital portfolio.


Managing Trade-offs: Prediction, Adaptability and Resilience


One critical new way of working that CEOs must “bottle” is organizational learning through local experimentation and global scaling. Lockdown has not only liberated the CEO, it has also freed local leaders from top-down governance. Often asking for forgiveness, rather than permission, they’ve innovated, disrupted and bullied their way to solutions that surmount obstacles and serve customers. In doing so, local teams have found support from the center. Some global leaders helped scale top solutions across the firm. They reimagined marketing and sales budgets overnight, showing the organization what costs are critical and what are dispensable. They solved huge supply chain issues, teaching the organization how to strengthen its operations. In order to ensure that this burst of experimentation and learning doesn’t become a historical oddity, leading CEOs will systematically protect the fundamental new relationship between global and local. They will set a clear agenda for the core business (or, as we like to call it, “Engine 1”): Continue the same pace of experimentation and learning throughout the long dance.


EY: revolutionising supply chain management with blockchain

While in traditional supply chains production is recorded digitally, when it comes to shipping, Brody explains, maintaining information continuity across systems and enterprise boundaries is a challenge: there are "oceans of digital data but only islands of useful information." Systems such as electronic data interchange (EDI) and XML messaging are being used by these companies to try to maintain information continuity, but even these systems pose their own challenges, such as being out of sync and moving data only one stop down the supply chain. "The result: inventory that seems to be in two places at once," added Brody. "These systems were created for an era of big, vertically integrated companies with large, but mostly static supply chains." Although relevant 30 years ago, in today's modern supply chain this is not the case. ... "Until the advent of bitcoin and blockchain technology, the only way you could get a large number of entities to agree upon a shared, truthful set of data, such as who has what bank balance, was to appoint an impartial intermediary to process and account for all transactions," highlighted Brody.


Microsoft is suddenly recommending Google products

Not merely extensions, but great extensions. I'm tempted to suspect a lawyer may have written that. Or at least someone in the Google marketing department. Naturally, I asked Microsoft why it had suddenly lurched from prickly to cuddly. Could it be that Google and Microsoft had a kiss-and-make-up Zoom call -- I mean, a Microsoft Teams call? Or a Google Meet encounter? Microsoft declined to comment. Perhaps, you might think, Microsoft has started to play nice merely because that's its brand image these days. Or perhaps some Redmonder stopped to think that, indeed, Edge doesn't currently enjoy enough of its own extensions. My delvings into Redmond's innards suggest the latter may have driven the decision even more than the former. You really don't want to annoy your customers, do you? Especially when you can't currently offer them what they need. Of course, Edge is based on Google's Chromium platform. In my own experimentations, I've found it to be a more pleasant experience than Chrome. Just that little bit more responsive and generally brighter -- though I can't quite cope with Bing as my default search engine.


Expanding Data Governance into the Future


Recognition that good Data Governance has become a must has come none too soon. Donna Burbank, Managing Director at Global Data Strategy, notes that many companies are beginning or planning to begin a Data Governance program, including a broader range of industries than before. However, spreading an existing successful Data Governance framework in one business area does not necessarily translate across the entire enterprise, or even to another company. Freddie Mac tried several times to implement DG driven by IT, and nothing stuck until a next-generation proactive and collaborative Data Governance took hold. Unfortunately, many companies, like Freddie Mac, get stuck in old patterns, trying to evangelize rigid Data Governance practices, gumming up operations, and fostering mistrust. Firms in this situation, according to Derek Steer, CEO at Mode, end up governing the wrong amount of data (missing the highest priority data assets) or enforcing Data Governance poorly (spending too much or too little time maintaining Data Governance logic). The first steps include understanding lessons from initial DG processes, how DG has changed, and how the next generation works better to support the business.


Amazon Faces A New Opponent: Some Of Its Own Tech Employees

Photo caption: U.S. employees of Amazon, its supermarket subsidiary Whole Foods, and supermarket delivery services were called to strike on May 1 to denounce employers accused of not sufficiently protecting them in the face of the pandemic.
Tech employees are speaking out for their blue-collar counterparts partly because the warehouse workers asked them to. Costa, who had been at the company for 15 years before she was fired, says warehouse workers reached out in March to the Amazon Employees for Climate Justice (AECJ), an internal group she co-founded two years ago, for help and support during the pandemic. “Tech workers are ‘a valued resource,’” Costa says. “They [Amazon management] see us as less expendable than warehouse workers because they know they can’t just throw more bodies at our seats if we leave. We have more leverage, and that’s why tech workers have much more privilege and have that much more responsibility to speak out.” AECJ organized a one-hour video call in mid-April during which warehouse workers could speak to Amazon tech employees who were interested to hear from them directly. The invite was sent out via Amazon’s internal e-mail system on Friday, April 10. “It got 1,550 accepts on a Friday afternoon, when New York, Europe and India were already off the clock,” Costa said.



Quote for the day:


"Leadership without character is unthinkable - or should be." -- Warren Bennis


Daily Tech Digest - July 31, 2019

The Power Of Purpose: The ROI Of Purpose

The impact of a purpose-driven initiative on the health of the brand is another key area to be measured. The most important ‘silver bullet’ question is whether a clear correlation can be drawn between purpose and sales. However, given the complexity of the purchase funnel, I believe that at the very least measuring ‘Purchase Intent’ (‘Does this initiative make you more or less likely to purchase this brand?’) is the closest proxy. ... Often, one of the biggest upsides of purpose-driven initiatives is the effect they have on the employees of the company in terms of morale and motivation - not to mention recruiting new talent, especially Millennials and Gen-Z, who are increasingly motivated by the opportunity to work for a company that creates meaningful social and environmental impact (leading to lower recruiting costs). While each company has its own metrics for measuring employee engagement, a common one worth measuring is the impact on turnover: does the initiative make employees more or less likely to stay with the company? Benevity’s research shows that employees are 57% more likely to stay with a company that offers volunteering and fundraising opportunities, leading to significant cost reductions.



Why CIOs should focus on trimming their internal email footprint


Reducing business’ reliance on email is just one part of a wider shift in the way companies need to operate going forward. Stanley Louw, UK and Ireland head of digital and innovation at Avanade, believes organisations need a strategy that is digital, not a digital strategy. “The way we have always provided IT for work is actually holding us back,” he said. “You have to apply the same principles of customer experience to employee experience. What is the experience employees need to do their job? CIOs have to start partnering with HR.” But in Louw’s experience, IT departments still approach desktop IT from a pure IT perspective, which makes their approach to the desktop archaic, very much based in legacy approaches to desktop management. Industry momentum around focusing on customer experience has changed the way businesses look at their customer, he said, adding: “You also need to look internally and start by modernising platforms.”


Chief Integration Officer!

As organizations go for technology-leveraged strategic transformation, they expect technology to help them maximize business value, as an organization. This is different from better decision-making or operational efficiency or a specific new capability at a functional level. The whole value accrued to the organization must be more than the sum of parts. Someone needs to drive that. For a select few organizations, that someone is a dedicated Chief Digital Officer. But more than 95% of organizations do not have a CDO role; most of them do not intend to have one. Yet, they still need someone to put all the pieces together to create organizational value. That integration has to be done by someone who thoroughly understands technology and its direction as well as business. In most organizations, the CIO is the best person to drive that role. The reason it has not happened more widely is not so much that top management doubts CIOs’ capability as that CIOs are not ready to move on from the nuts and bolts, since that may mean giving up control over a big chunk of the IT infrastructure budget.


Why Proxies Are Important for Microservices


The dynamic nature of microservices applications presents challenges when implementing reverse proxies. Services can come and go as they are revisioned or scaled and will have random IP addresses assigned. The synchronization of the available services and the configuration of the reverse proxy is essential to ensure error-free operation. One solution is to use a service registry (e.g. etcd) and have each service maintain its registration while it is running. The reverse proxy watches the service registry to keep its configuration up to date. Kubernetes does all of this automatically for you as part of its automation. The Kube DNS process maintains the service registry with an address (A) and service (SRV) record for each service. The Kube Proxy process routes and load-balances requests across all instances of the services. With all incoming request traffic for a microservices application typically passing through proxies, it is essential to monitor the performance and health of those proxies. Instana sensors include support for Envoy Monitoring, Nginx Monitoring, and Traefik Monitoring, with more proxy technologies coming.
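
As a rough sketch of the registry pattern described above, assuming the etcd3 Python client and a locally reachable etcd, a service instance might register itself under a short-lived lease that it keeps refreshing, while the reverse proxy watches the same key prefix and rebuilds its upstream list whenever instances appear or disappear. This is only an outline of the idea, not how Kubernetes itself implements Kube DNS or Kube Proxy, and the service name and addresses are hypothetical.

# Sketch only: requires the `etcd3` package and an etcd server on localhost.
import threading
import time

import etcd3

PREFIX = "/services/orders/"          # hypothetical service name
etcd = etcd3.client(host="localhost", port=2379)

def register(instance_id: str, address: str, ttl: int = 10) -> None:
    """Register this instance and keep the lease alive while the service runs."""
    lease = etcd.lease(ttl)
    etcd.put(PREFIX + instance_id, address, lease=lease)

    def keepalive() -> None:
        while True:
            time.sleep(ttl / 2)
            lease.refresh()           # registration disappears if the process dies

    threading.Thread(target=keepalive, daemon=True).start()

def current_upstreams() -> list[str]:
    """What the reverse proxy would load-balance across right now."""
    return [value.decode() for value, _meta in etcd.get_prefix(PREFIX)]

def watch_registry() -> None:
    """Reconfigure the proxy whenever the set of instances changes."""
    events, _cancel = etcd.watch_prefix(PREFIX)
    for _event in events:
        print("reloading proxy config:", current_upstreams())

register("instance-1", "10.0.0.7:8080")
print(current_upstreams())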


Browser OS could turn the browser into the new desktop


A potential challenge is that the browser becomes the desktop for the end user, and that's something folks have to get used to. But to Google's credit and its partnerships with vendors like VMware or Citrix, the UX challenge becomes almost invisible. We'll see how enterprises continue to approach this opportunity, which is ultimately more secure. For certain use cases, field services for example, if a browser OS-based device either dies or gets broken or lost, no data is lost. A user can just go get a new Chromebook and sign back in where he or she left off. That's an unheard-of value proposition -- that begin-where-you-left-off concept is powerful. One other problem enterprises may face is around Microsoft legacy infrastructure -- particularly around endpoint management. Microsoft has moved away from that to help bridge the divide, and Windows 10 is doing well. We'll see a lot more migration happening this year as the Windows 7 sunset comes closer.


This new Android ransomware infects you through SMS messages


Depending on the infected device's language setting, the messages will be sent in one of 42 possible language versions, and the contact's name is also included in the message automatically.  If the link is clicked and the malicious app is installed manually, it often displays material such as a sex simulator. However, the real purpose is quietly running in the background. The app contains hardcoded command-and-control (C2) settings, as well as Bitcoin wallet addresses, within its source code. However, Pastebin is used by the attackers as a conduit for dynamic retrieval.  Once the propagation messages have been sent, Filecoder then scans the infected device to find all storage files and will encrypt the majority of them. Filecoder will encrypt file types including text files and images but fails to include Android-specific files such as .apk or .dex.  ESET believes that the encryption list is no more than a copy-and-paste job from WannaCry, a far more severe and prolific form of ransomware.  A ransom note is then displayed, with demands ranging from approximately $98 to $188 in cryptocurrency. There is no evidence that files will be lost after the time threatened.


Google bolsters hybrid cloud proposition for enterprises through VMware partnership


Google Cloud CEO Thomas Kurian confirmed the news in a blog post, in which he described the move as “another significant step” in his firm’s drive to bolster the enterprise appeal of its public cloud platform.  In recent years, this has seen the Google Cloud team roll out a series of data security and functionality improvements to its platform, and move to introduce industry-specific services. Such moves have resulted in Google Cloud developing into an $8bn annual revenue run rate company that is growing “at significant pace”, as was confirmed by Google CEO Sundar Pichai during a conference call to discuss its parent company’s second-quarter results earlier this month. “Customers are choosing Google Cloud for a variety of reasons,” said Pichai on the call, transcribed by Seeking Alpha. “Reliability and uptime are critical.” He also made reference to the “flexibility” that organisations also need when moving to the cloud, so they can proceed in their “own way”. 


The Future of API Management

There will be a continued emphasis on the developer: recognizing that the developer is king and making the job easier for them. We understood this for the public, but we still need to improve internally, in the form of a service catalog for internal developers that makes it easier for them to ramp up and get the benefits of existing APIs. New architecture: everything is driven by containers and container platforms, and K8s is leading to microservices architectures with new approaches to controlling traffic, with sidecar approaches like Envoy and Istio providing a service mesh to manage applications within the container cluster. As these things come up, there will be a proliferation of types of control points with multiple form factors. We embrace Envoy as the new gateway. Right now we live in a mixed world and it’s important to consider how service mesh and API management will overlap. API management is about the relationship between providing a service and multiple consumers of that service. The more scale, the more important the formal API management platform.


A zero day is a security flaw that has not yet been patched by the vendor and can be exploited and turned into a powerful weapon. Governments discover, purchase, and use zero days for military, intelligence and law enforcement purposes — a controversial practice, as it leaves society defenseless against other attackers who discover the same vulnerability. Zero days command high prices on the black market, but bug bounties aim to encourage discovery and reporting of security flaws to the vendor. The patching crisis means zero days are becoming less important, and so-called 0ld-days become almost as effective. ... Not all zero days are complicated or expensive, however. The popular Zoom videoconferencing software had a nasty zero day that "allows any website to forcibly join a user to a Zoom call, with their video camera activated, without the user's permission," according to the security researcher's write up. "On top of this, this vulnerability would have allowed any web page to DoS (Denial of Service) a Mac by repeatedly joining a user to an invalid call." The Zoom for Mac client also installs a web server on your laptop that can reinstall the Zoom client without your knowledge if it's ever been installed before.


One thing that is widely agreed upon by the security pros – as Kubernetes adoption and deployment grows, so will the security risks. There have been multiple recent events in the cloud and mobile dev spaces where these environments were compromised by attackers. This included everything from disruption, crypto mining, ransomware, and data stealing. Of course, these types of deployments are just as susceptible to exploits and attacks from attackers and insiders as the traditional environments. Thus, it is more important to ensure your large-scale Kubernetes environment has the right deployment architecture and that you use security best practices for all these deployments. As Kubernetes is more widely adopted, it becomes a prime target for threat actors. “The rapid rise in adoption of Kubernetes is likely to uncover gaps that previously went unnoticed on the one hand, and on the other hand gain more attention from bad actors due to a higher profile,” says Amir Jerbi, CTO at Aqua Security.



Quote for the day:


"Leaders begin with a different question than others. Replacing who can I blame with how am I responsible?" -- Orrin Woodward


Daily Tech Digest - July 02, 2019

TIN coalition calls for industry action against cyber fraud


The vision for overcoming social engineering challenges is to reduce the opportunities to establish false trust and to ensure that all remaining threats are well publicised and understood. The vision also requires organisations to interact with customers and staff in a way that reinforces security and to ensure that the security of interactions with individuals becomes less dependent on public information. To address operating in silos, the vision is to ensure that cyber fraud is understood across functions within and between organisations, to ensure that organisations are recognised for sharing useful information, not punished for suffering an attack, and to ensure that business and law enforcement collaborate effectively to tackle cyber fraud. And to reduce the gap between cyber security and anti-fraud operations, the vision is to ensure that the response to cyber attacks minimises the broader impact of data loss on society, that fraud teams in business and law enforcement are fully engaged in tackling cyber attacks as a precursor to fraud, that enforcement is globalised to tackle all forms of cyber fraud


Big Data Is Dead. Long Live Big Data AI.

“The value of the data analytics market can’t be ignored. The Looker and Tableau acquisitions demonstrate that even the biggest tech players are snapping up data analytics companies with big price tags, clearly demonstrating the value these companies have in the larger cloud ecosystem. And in terms of what this means for the evolution of AI, we’ve reached a point where we have more than enough anonymized data to train the system, and now it’s a matter of honing how we use the AI to extract the maximum value from data”—Amir Orad, CEO, Sisense “The Google Cloud/Looker and Salesforce/Tableau acquisitions are a direct reaction to the rate at which analytics workloads have been shifting to the cloud over the past few years. The state of AI is a reflection of this shift as machine learning, AI and analytics have become the primary growth opportunities for the cloud today. Yet, it's this same growth that is causing barrier to success as AI project overwhelming face the same problem -- data quality”—Adam Wilson, CEO, Trifacta


What can you do with the Microsoft Graph?


Working with the APIs can be tricky; it can be hard to construct the right query, especially if you're looking for more complex graph queries. Microsoft offers tools to help build and test queries, as well as SDKs that can simplify adding Graph support to your apps. One, the web-based Graph Explorer, allows you to try out queries without logging in to an Office 365 account. It provides sample queries that show how to extract specific information from the service, with a library of different queries to get started. You can only use GET queries against sample data; POST requires your account details and your data. Once you're ready to start working with live data, you can log in with a Microsoft account, and start using your Microsoft 365 tenant. The list of query categories is long, covering working with users, with mail and calendar, as well as files and apps. The Graph Explorer doesn't only show production queries, it supports beta APIs, so you can experiment before adding them in your code. Queries can be cut-and-pasted from the Explorer, and you can see any request headers or bodies that need to be constructed and delivered with the REST HTTP query.
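
Once a query has been worked out in Graph Explorer, calling it from code is a plain HTTPS request. The sketch below assumes you already hold a valid OAuth access token for a Microsoft 365 tenant and uses the documented /me/messages endpoint with $select and $top parameters; the helper name and field choices are just for illustration.

# Sketch only: token acquisition (e.g. via MSAL) is out of scope here.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"

def recent_messages(access_token: str, top: int = 5) -> list[dict]:
    """GET the signed-in user's most recent mail messages."""
    resp = requests.get(
        f"{GRAPH}/me/messages",
        headers={"Authorization": f"Bearer {access_token}"},
        params={"$select": "subject,from,receivedDateTime", "$top": top},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["value"]

# for msg in recent_messages(token):
#     print(msg["receivedDateTime"], msg["subject"])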



Offensive Security launches OffSec Flex, a new cybersecurity training program

Organizations can now use OffSec Flex to purchase blocks of Offensive Security’s industry-leading practical, hands-on training, certification and virtual lab offerings, allowing them to proactively increase and enhance the level of cybersecurity talent available within their organizations. With Offensive Security’s hands-on courses, labs and exams readily available, organizations are able to offer educational opportunities to new hires and non-security team members alike, improving their security posture and equipping their employees with the adversarial mindset necessary to protect modern enterprises from today’s threats. “Cybersecurity training is not just for security professionals anymore,” said Kerry Ancheta, VP of Worldwide Sales, Offensive Security. “Increasingly we see organizations recommend pentest training courses for their software development or application security teams in order to improve their understanding for how their systems and applications are attacked.


Calculating The Cost of Software Quality in Your Organization


Basically, the costs of software quality (COSQ) are those costs incurred through both meeting and not meeting the customer’s quality expectations. In other words, there are costs associated with defects, but producing a defect-free product or service has a cost as well. Calculating these costs serves the purpose of identifying just how much the organization spends to meet the customer’s expectations, and how much it spends (or loses) when it does not.  Knowing these values allows management and team members across the company to take action in ensuring high quality at a lower cost. While analyzing the COSQ at an organization may lead to the revelation of uncomfortable truths about the state of quality management at the company, the process is important for eliminating waste associated with poor quality. This often requires a mindset and culture shift from viewing software quality defects as individual failures to seeing them as opportunities to improve as a collective team.
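
One common way to put numbers on this is the prevention-appraisal-failure breakdown, where prevention and appraisal spending make up the cost of achieving good quality, and internal and external failure costs make up the cost of poor quality. The sketch below uses that breakdown with purely hypothetical quarterly figures.

# Sketch only: a simple prevention-appraisal-failure view of COSQ.
def cost_of_software_quality(prevention: float, appraisal: float,
                             internal_failure: float, external_failure: float) -> dict:
    good = prevention + appraisal               # spent to meet quality expectations
    poor = internal_failure + external_failure  # spent (or lost) when we don't
    return {"cost_of_good_quality": good,
            "cost_of_poor_quality": poor,
            "total_cosq": good + poor}

# e.g. training/code review, testing and QA tooling, pre-release rework,
# production incidents and support escalations (all per quarter, in $k):
print(cost_of_software_quality(prevention=120, appraisal=200,
                               internal_failure=150, external_failure=90))
# {'cost_of_good_quality': 320, 'cost_of_poor_quality': 240, 'total_cosq': 560}

Tracking these buckets over time makes it visible whether money spent on prevention and appraisal is actually shrinking the failure costs, which is the practical payoff of the analysis.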


Machine learning has been used to automatically translate long-lost languages


It’s not hard to imagine that recent advances in machine translation might help. In just a few years, the study of linguistics has been revolutionized by the availability of huge annotated databases, and techniques for getting machines to learn from them. Consequently, machine translation from one language to another has become routine. And although it isn’t perfect, these methods have provided an entirely new way to think about language. Enter Jiaming Luo and Regina Barzilay from MIT and Yuan Cao from Google’s AI lab in Mountain View, California. This team has developed a machine-learning system capable of deciphering lost languages, and they’ve demonstrated it by having it decipher Linear B—the first time this has been done automatically. The approach they used was very different from the standard machine translation techniques. First some background. The big idea behind machine translation is the understanding that words are related to each other in similar ways, regardless of the language involved.


The Agile Manifesto: A Software Architect's Perspective

Specifications with an architectural impact (in the form of new user stories) should be tracked by the architect and assessed pragmatically by the whole development team, including experienced developers, test engineers, and DevOps. Bad habits from the past, when the architect created the full-blown technical design on paper for the team, do not fit within modern agile environments. There are multiple flaws in this model, which I have also faced in my day-to-day work. First and most important, the architect might be wrong. This happened to me after I created a detailed upfront technical design and presented it to the development team during Sprint refinements. I got questions about cases I did not think about or failed to take into account. In most cases, it turned out the initial design was either incomplete or impractical, and required extra work. Big upfront design also limits the creativity and autonomy of the team members, since they must follow a recipe that has already been handed down. From a psychological standpoint, even the author might become biased and more reluctant to change it afterwards, trying to prove it is correct rather than admitting its flaws.


Essential tips for scaling quality AI data labeling


Data scientists are using labeled data and natural language processing (NLP) to automate legal contract review and predict patients who are at higher risk of chronic illness. The success of these systems depends on skilled humans in the loop, who label and structure the data for machine learning (ML). High-quality data yields better model performance. When data labeling is low quality, an ML model will struggle to learn. According to a report by analyst firm Cognilytica, about 80 percent of AI project time is spent on aggregating, cleaning, labeling, and augmenting data to be used in ML models. Just 20 percent of AI project time is spent on algorithm development, model training and tuning, and ML operationalization. These tasks are at the heart of AI development and require strategic thinking, along with a more advanced set of engineering or computer science skills. It’s best to deploy more expensive human resources — such as data scientists and ML engineers — on tasks that require expertise, collaboration, and analytical skills.


Effective or Not? The Real Impact of GDPR


The General Data Protection Regulation wasn’t just meant to give governments the means to enforce data security rules. Another key objective was to change how both companies and users behave when it comes to ensuring personal data remains private and protected. In this sense, GDPR seems to have had the desired impact. ... Another interesting fact the data shows is that users may have moved some of their own responsibility to GDPR enforcers. Two indicators led to this observation: respondents are less likely to read privacy statements than they were in 2015 (down 7 percentage points), and 17% say it is enough for them to see that the website has a privacy policy, so they choose not to read the document at all. A similar behavior pattern emerges with social media usage. Fewer users – 56% in 2019 vs 60% in 2015 – actually change the privacy settings for their personal profile. The three most common reasons social network users give for not trying to change their personal profile’s default settings are that they trust the sites to set appropriate privacy settings (29%), that they do not know how to (27%), or that they are not worried about sharing their personal data (20%).


5 steps for digital workplace transformation


Start by recognizing actionable opportunities within your business operations. Approach the prospects for digital transformation from a business rather than a technology perspective. Line-of-business (LOB) teams should lead this effort, coordinating closely with senior IT staffers to identify critical barriers to success. Of course, each organization faces its own set of challenges. But, at the outset, step back and identify key themes -- accelerating innovation, enhancing productivity, improving governance or reshaping the steps in the customer journey -- that make good business sense. Consider operations as a whole, while focusing on people and processes, and determine your target audiences: employees, partners and/or customers. Then, engage a cross section of these audiences in conversations about what they are doing and how they understand the underlying business purposes. Develop both technology and business insights into what is happening from the participants' perspectives. Listen carefully as they describe their tasks, and be sure to observe how they do their work to determine where bottlenecks occur.



Quote for the day:


“The real voyage of discovery consists not in seeking new landscapes but in having new eyes.” -- Marcel Proust


Daily Tech Digest - June 30, 2019

How a quantum computer could break 2048-bit RSA encryption in 8 hours


Shor showed that a sufficiently powerful quantum computer could do this with ease, a result that sent shock waves through the security industry. And since then, quantum computers have been increasing in power. In 2012, physicists used a four-qubit quantum computer to factor 143. Then in 2014 they used a similar device to factor 56,153. It’s easy to imagine that at this rate of progress, quantum computers should soon be able to outperform the best classical ones. Not so. It turns out that quantum factoring is much harder in practice than might otherwise be expected. The reason is that noise becomes a significant problem for large quantum computers. And the best way currently to tackle noise is to use error-correcting codes that require significant extra qubits themselves. Taking this into account dramatically increases the resources required to factor 2048-bit numbers. In 2015, researchers estimated that a quantum computer would need a billion qubits to do the job reliably. That’s significantly more than the 70 qubits in today’s state-of-the-art quantum computers.
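
For a sense of scale, the numbers factored in those early demonstrations are trivial for classical hardware. A minimal Python sketch (plain trial division, nothing to do with Shor's algorithm) recovers their factors instantly, which is exactly why a 2048-bit RSA modulus, and the error-corrected qubit count it implies, is a different problem entirely:

    def trial_factor(n):
        # Naive trial division: instant for toy numbers, hopeless for 2048-bit RSA moduli.
        factors = []
        d = 2
        while d * d <= n:
            while n % d == 0:
                factors.append(d)
                n //= d
            d += 1
        if n > 1:
            factors.append(n)
        return factors

    print(trial_factor(143))    # [11, 13]  -- the 2012 demonstration's target
    print(trial_factor(56153))  # [233, 241] -- the 2014 demonstration's target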



How Urbanhire is disrupting HR in Indonesia

Specifically, the hiring platform allows companies to post jobs across more than 50 portals, including Google, LinkedIn and Line - a freeware app which became Japan’s largest social network in 2013. Tapping into a pool of more than one million active jobseekers, the software-as-a-service (SaaS) platform follows a “data-driven hiring strategy”, aligning businesses to a four-step digital strategy of “source, assess, recruit and on-board”. Three years since launching, key customers include global brands such as AIA, Zurich and The Body Shop, in addition to Indonesian organisations like Danamon, Pertamina and Djarum. “Indonesia is a fantastic opportunity given where it is at from a growth perspective,” Kamstra added. “As a tech entrepreneur, I love the fact that we can use business models that have been successful in more developed countries without a lot of the baggage that comes with historical tech implementations that are no longer sufficient. “I love to use the telecom industry as an example. Indonesia was able to go from little infrastructure to a very modern one by not having gone through all the investment steps that countries like the US were forced to do as pioneers.”


Don't Miss These 10 Cybersecurity Blind Spots

When an employee is terminated, it’s important to shut down their access to all work-related accounts — immediately. Ideally, automate as much of the account-termination process as possible and ensure the process covers all accounts for all employees. This can be easier said than done, but it's important to nail down a process or automated solution before a former employee's lingering access causes a breach. ... Any application that uses third-party software components, including open-source components, takes on the risk of potential vulnerabilities in those dependencies. These vulnerabilities should be identified, tracked and accounted for in the same way as every other software component. ... Service accounts are used by machines, and user accounts are used by humans. The trouble with service accounts is that they often have access to many different systems, and their passwords aren’t always managed well. Poorly managed passwords make for easy compromise by attackers.
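
To make the automation point concrete, here is a rough Python sketch of an offboarding step that disables a departing user's accounts across several systems; the system names and the disable_* helpers are hypothetical placeholders, not a real vendor API:

    # Hypothetical sketch of automated account termination across multiple systems.
    def disable_identity_provider(user): ...   # placeholder for an IdP/SSO API call
    def disable_vpn(user): ...                 # placeholder for a VPN/RADIUS call
    def disable_source_control(user): ...      # placeholder for a source-control API call

    OFFBOARDING_STEPS = {
        "identity provider": disable_identity_provider,
        "VPN": disable_vpn,
        "source control": disable_source_control,
    }

    def offboard(user):
        failures = []
        for system, disable in OFFBOARDING_STEPS.items():
            try:
                disable(user)                   # revoke access in this system
            except Exception as exc:
                failures.append((system, exc))  # keep going; every account must be covered
        return failures                         # anything listed here needs manual follow-up

    # offboard("jdoe") would run as soon as HR flags the termination.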


Business needs to see infosec pros as trusted advisers

The first issue clouding communication between security professionals and the board or senior business leaders is the misunderstanding that IT risk is separate from business risk. Nothing could be further from the truth, especially since in most organisations today the separation between what is IT and what is business is hard to identify, because technology is the backbone of everything the business does. The second issue relates to how the message is packaged. Is the language full of technical jargon, or is it simple to understand and does it get the message across in business terms? Does it highlight the loss to the business in terms understood by the board and senior business leaders? Take the example of the business downtime required to apply a patch. Instead of talking in terms of technical threats and the outcomes of poor patching, security professionals would be more effective explaining the need in terms of loss to the business, such as lost opportunities or losses from an attack that may occur because of the unpatched status.


MongoDB CEO explains where the company has an edge over database giant Oracle


Cramer noted that Oracle, which has a nearly $195 billion market cap, has recently bought back billions in stock and has a big war chest. Despite that, MongoDB's architecture sets the younger company apart, Ittycheria said. The firm's database is built for the modern world, he added. "[Oracle] built an architecture designed in the late '70s for the world then, and they just tried to make it better over time," he said. "We built an architecture designed for today's high-performance mobile cloud computing world." Ittycheria explained how MongoDB helped Cisco address an order management application issue in which it receives tens of billions of orders from different sales channels a year. The platform serves more than 14,000 customers, including some of the most "sophisticated, demanding customers in the world." The list ranges from big media to telecom to gaming to financial services, he said. Start-ups are also developing their businesses on MongoDB, Ittycheria said.
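
To illustrate the document-model point, a minimal pymongo sketch is below; the connection string, database, and field names are invented for illustration, and the detail it shows is that orders with different shapes can sit in the same collection without a schema migration:

    from pymongo import MongoClient  # assumes the pymongo driver and a running MongoDB instance

    client = MongoClient("mongodb://localhost:27017")  # placeholder connection string
    orders = client["shop"]["orders"]

    # Two orders with different shapes coexist in one collection -- no ALTER TABLE required.
    orders.insert_one({"order_id": 1, "channel": "web",
                       "items": [{"sku": "A1", "qty": 2}]})
    orders.insert_one({"order_id": 2, "channel": "partner",
                       "items": [{"sku": "B7", "qty": 1}],
                       "partner": {"name": "ExampleCo", "tier": "gold"}})

    print(orders.find_one({"channel": "partner"}))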


Fix your cloud security

Enterprises are either not willing to use the right technology, or they don’t understand that the technology exists. It’s not that the database is unencrypted; it’s that nobody has any idea how to turn on encryption in flight or at rest. Also at fault are the “it was not on-premises” folks out there. They cling to the fact that because some security feature was not part of the original on-premises system, it shouldn’t be needed in the cloud. The time to deal with security issues is when you move from on-premises to the public cloud. Spend at least a couple of weeks looking at identity and access management, encryption, auditing, proactive security, and more, and then evaluate their viability for your enterprise. Otherwise, you could miss the cloud security boat as you make the migration. In my opinion, this is the single most important step in migration. It lets you reflect on what your security needs really are and how to solve them using cloud computing technology which, these days, is better than anything you can find on-premises.
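
As one concrete example of "turning on encryption at rest" rather than lamenting its absence, the boto3 sketch below sets default server-side encryption on an S3 bucket; the bucket name is a placeholder and AWS credentials are assumed to be configured already:

    import boto3  # assumes AWS credentials and a default region are configured

    s3 = boto3.client("s3")

    # Enforce default server-side encryption (SSE-S3 / AES-256) for every new object
    # written to the bucket. Replace the bucket name with your own.
    s3.put_bucket_encryption(
        Bucket="example-data-bucket",
        ServerSideEncryptionConfiguration={
            "Rules": [
                {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}
            ]
        },
    )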


Can Apple compete on privacy?


Apple's privacy campaign has already had an impact in terms of forcing the competition to pay closer attention to their disclosures and controls. It is unlikely to move the needle in terms of market share, but Apple can only gain as awareness of the great data tradeoff behind targeted advertising grows and missteps in executing it continue. The campaign should also be more effective as a retention tool for anyone who has not already been locked into Apple's milieu through its self-reinforcing portfolio of devices and growing family of services. Furthermore, while the smartphone market is mature, whatever challenges it as an emerging platform will likely raise even more profound privacy concerns. Already, wearables measure our pulse and assess whether we've fallen, and the kind of personal data that could be generated by measuring exactly what you're looking at via augmented reality gear could make smartphone-generated data seem crude by comparison. And there's another potential benefit to Apple's privacy campaign, one that the company has developed since it first stepped up its advocacy.


Serverless: applications only when you need them - no more, no less

Traditional IT architectures use a server infrastructure, whether on-premises or cloud-based, that requires managing the systems and services an application needs to function. The application must always be running, and the organization must spin up additional instances to handle more load, which tends to be resource-intensive. Serverless architecture instead has the infrastructure provided by a third party, with the organization supplying only the code for the application, broken down into functions that the third party hosts. This allows the application to scale based on function usage and is more cost-effective, since the third party charges for how often the application's functions are invoked rather than for having the application running all the time. ... Serverless computing is constrained by performance requirements, resource limits, and security concerns, but excels at reducing compute costs. That said, where feasible, migrate to serverless infrastructure gradually to make sure it can handle the application's requirements before phasing out the legacy infrastructure.
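
For illustration, a "function" in this model is just a small handler the provider invokes on demand; the sketch below follows the AWS Lambda Python handler convention, with a made-up input field:

    import json

    def handler(event, context):
        # Invoked by the platform only when a request arrives; nothing runs (or bills) in between.
        name = event.get("name", "world")  # hypothetical input field
        return {
            "statusCode": 200,
            "body": json.dumps({"message": f"Hello, {name}!"}),
        }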


Four Myths of Digital Transformation: What Only 8% of Companies Know


New research by Bain & Company finds that only 8% of global companies have been able to achieve their targeted business outcomes from their investments in digital technology. Said another way, more than 90% of companies are still struggling to deliver on the promise of a technology-enabled business model. What secret formula do the 8% deploy? Unsurprisingly, there are no shortcuts or silver bullets. But successful transformations do share some common themes. One of the most important is understanding that this is really a business transformation, supported by investments in new technology—not new technology in search of opportunities. Many executives pay lip service to this idea, but in practice they delegate too much responsibility to the tech team, hoping the business can watch from the sidelines. At the 8%, executive teams understand that the core of a digital transformation is a business transformation: changing the way the company engages customers across channels, simplifying business processes, and redesigning products or services.


“We need to up our game”—DHS cybersecurity director on Iran and ransomware

Both the Iranian malicious activities and the ransomware attacks depend largely on exploiting the same sorts of security issues. Both rely on the same tactics: malicious attachments, stolen credentials, or brute-force credential attacks to gain a foothold on targeted networks, usually deploying readily available malware and then using those credentials to move across the network. When asked whether the recent ransomware attacks on cities across the US (including three recent attacks in Florida with dramatically larger ransom demands) were indicative of a new, more targeted set of campaigns against US local governments, Krebs said the attacks were likely not targeted—at least not initially. "I still think these [ransomware campaigns] are fairly expansive efforts, where [the attackers] are initially scanning, looking for certain vulnerabilities, and when they find one that's when they start to target," he said. "Again, I'm not sure we have the information right now saying they were specifically targeted."



Quote for the day:


"Leaders stuck in old cow paths are destined to repeat the same mistakes. Change leaders recognize the need to avoid old paths, old ideas and old plans." -- Reed Markham