Technology is so ubiquitous that this is a societal problem we all have to reckon with. It’s much more serious than something that just affects your family or your company. This is a problem of international magnitude, one with homeland security risks around it. That’s why we wrote the book: the vast majority of our clients still were not listening. They just wanted us for environmental work, but they weren’t really sold on the hardware data destruction part of the work yet. We wanted to write this book to share some examples of serious consequences, to show that this isn’t some remote, theoretical concern. ... What happens is that guy will pick up the devices for free, put them in a container, and sell them wholesale to the highest bidder. Lots of those buyers are harvesting the precious metals and materials out of old electronics, but there are also adversaries who pose homeland security risks and want to pull out the hard drives and find a way to harm us here in the U.S., or hold corporate data for ransom. From those examples you can see why you need to protect your financial and personal data on an individual level too.
Within the cybersecurity industry, the prevailing mindset is that security practitioners are professionals. A direct consequence of this mindset is that a college degree is required for many cybersecurity jobs. A recent (ISC)² report indicates that 86% of the current cybersecurity workforce has a bachelor's degree or higher. Furthermore, a quick search on Indeed.com shows about 46,000 cybersecurity jobs, of which 33,000 (>70%) require a degree. However, many cybersecurity practitioners I know would rightfully argue that a college degree isn't needed to do most jobs in cybersecurity, and strict adherence to this requirement disqualifies many deserving candidates. But removing the requirement for a college degree raises the question: Are these actually professional jobs, or should they be recast as vocational jobs? I would argue that these jobs may need to be seen as vocations instead of professions. Although many cybersecurity workers take pride in their professional status, many of their jobs are really vocational in nature and could be filled by those with the appropriate level of vocational training.
To enable true knowledge collaboration and connect employees with the information they require, we must start using the data we already have in organizations to draw conclusions at scale. In doing so, we can connect people who have questions to the right colleague(s) with the answer(s). Artificial intelligence has two additional important qualities that help businesses achieve this and overcome the issues with legacy knowledge management. First, AI can be taught to forget. Not only can AI identify who knows what about a topic, it can also contextualize that information and recognize when it becomes outdated or redundant, meaning it can ‘forget’ stale data as needed. Second, using non-sensitive information drawn from existing tools, AI is able to see through silos. It can use all kinds of information to draw conclusions at scale, creating in one integrated platform a live map, or ‘knowledge network’, of who knows what within an organization. In short, using data, AI can build a network of knowledge and expertise in real time.
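As a minimal sketch of the "knowledge network with forgetting" idea (purely illustrative, not any vendor's implementation; every name and number below is hypothetical), one could score each colleague's expertise per topic from their past contributions and apply a time decay so that old, unrefreshed expertise is gradually 'forgotten':

```python
from collections import defaultdict
from datetime import date

# Hypothetical activity records: (author, topic, date of contribution).
ACTIVITY = [
    ("alice", "kubernetes", date(2021, 11, 1)),
    ("alice", "kubernetes", date(2021, 12, 15)),
    ("bob", "kubernetes", date(2019, 1, 10)),   # years old: should be "forgotten"
    ("bob", "payroll", date(2021, 12, 20)),
]

HALF_LIFE_DAYS = 180  # assumed decay rate: expertise weight halves every ~6 months

def expertise_map(activity, today):
    """Score each (topic, author) pair, down-weighting old contributions."""
    scores = defaultdict(float)
    for author, topic, when in activity:
        age = (today - when).days
        scores[(topic, author)] += 0.5 ** (age / HALF_LIFE_DAYS)
    return scores

def experts_for(topic, scores, min_score=0.1):
    """Rank colleagues for a topic, dropping anyone whose score has decayed away."""
    ranked = [(a, s) for (t, a), s in scores.items() if t == topic and s >= min_score]
    return sorted(ranked, key=lambda pair: -pair[1])

scores = expertise_map(ACTIVITY, today=date(2022, 1, 1))
print(experts_for("kubernetes", scores))  # alice ranks first; bob's stale score is dropped
```

A real system would derive the activity records from existing tools (chat, tickets, documents) rather than a hand-written list, but the decay mechanism is the essence of "teaching AI to forget."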
If you know Lightroom, you will have no problem navigating darktable. Like GIMP, darktable is open-source, and new functionality is added regularly, which only increases its appeal. While by no means beginner software, the interface is sneakily slick for a program with this much power under the hood. Adjusting contrast, brightness, and saturation is a breeze, handled by simple sliders. The same can be said for achieving perfect shadows and highlights, modifying the graduated density of your image, or adding grain. Do not be fooled, though: just beyond those simple controls lies a wealth of robust tools for more advanced users ... RawTherapee is an open-source, cross-platform photo editor that offers a non-destructive, 32-bit engine and uses powerful algorithms to help you develop the highest-quality image possible. If GIMP is Photoshop, think of RawTherapee as Lightroom. While most useful as a processing tool in conjunction with another editing application, RawTherapee is still a perfectly functional editor in its own right, offering several features familiar to Photoshop users.
Ethereum developers, for instance, have consistently stressed security over speed while making sure the network doesn't have any downtime. By contrast, the Solana network shut down for almost 18 hours in September because it was unable to handle high transaction volumes. Kline told Decrypt, "At the end of the day, chain security is incredibly important for financial transactions and for the foreseeable future Ethereum has the most security.” According to Kline, DeFi projects on other blockchains are "heavily driven by token incentives," meaning that people receive tokens that they can then trade or sell as a reward for participating. "Once Ethereum layer 2 adopts those same incentives, we are likely to see a lot more DeFi activity on Ethereum," she said. But the head of public affairs for Parity, which built Polkadot, believes developers are getting tired of waiting for Ethereum 2.0 to be fully ready. "The Ethereum sharding roadmap has changed so many times it is difficult to understand what is actually going to happen and when," said Peter Mauric.
Binghamton University Professor Sang Won Yoon explained this in detail: "With the rapid technology development, such as the Industrial Internet of Things, big data analysis, cloud computing, artificial intelligence, many manufacturing processes can be more intelligent, and Industry 4.0 can then be realized in the near future … . Data-driven solutions, such as AI and machine-learning algorithms, can be applied to diagnose abnormal defects and adjust optimal machine parameters in response to unexpected changes/situations during production. Smart manufacturing adopts real-time decision-making based on operational and inspectional data and integrates the entire manufacturing process as a 'unified framework.'" ... Imagine a series of closed-loop systems distributed at the enterprise edge that can "run themselves" in a closed environment, much like a mini-network. This could reduce present resource stressors, like challenges in managing and paying for large data payloads that continuously stream over communications lines to data centers and clouds.
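To make the closed-loop idea concrete, here is a minimal hypothetical sketch (not from the article; all parameter names, numbers, and the linear process model are assumptions): an edge node measures each part locally, compares the measurement to its target, and corrects a machine parameter itself, with no raw data streamed to a central cloud.

```python
# Minimal closed-loop sketch: an edge node adjusts one machine parameter
# from local inspection data. All names and numbers are illustrative.

TARGET_THICKNESS = 2.00   # desired part thickness (mm), assumed spec
PROCESS_GAIN = 0.01       # assumed: thickness change per unit of pressure
LOOP_GAIN = 0.5           # proportional gain: how aggressively to correct

def inspect(pressure):
    """Stand-in for a real inspection sensor: thickness responds linearly to pressure."""
    return 1.0 + PROCESS_GAIN * pressure

def control_loop(pressure, cycles):
    """Each cycle: measure, compare to target, correct the parameter locally."""
    for _ in range(cycles):
        error = TARGET_THICKNESS - inspect(pressure)
        pressure += LOOP_GAIN * (error / PROCESS_GAIN)  # invert the assumed model
    return pressure

final_pressure = control_loop(pressure=50.0, cycles=10)
# After 10 cycles the measured thickness has converged close to the 2.00 mm target.
```

A real deployment would replace the linear stand-in with live sensor readings, but the structure, measure locally, decide locally, actuate locally, is what lets such systems "run themselves" at the edge.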
As with so many other technologies, what is better for some companies is not better for others. Both platforms are excellent for building, deploying, and managing containerized applications. Kubernetes is great for intensive-use apps that require regular updates, like games. OpenShift may be the right option for security-critical, GDPR-compliant, heavy-duty apps, such as those in institutional, governmental, or healthcare settings. Self-hosted Kubernetes is more complex to install, manage, and monitor without third-party integrations. OpenShift seems to be an easier option to manage with its many built-in features,
but it is limited to Red Hat Linux distributions. At its core, OpenShift is built on a Kubernetes layer but adds features that make it a distinct flavor of container orchestration. Enterprises can benefit from the dedicated support provided by an OpenShift subscription. Still, Kubernetes may be the best option for companies with a skilled container orchestration team, avoiding subscription costs. Kubernetes and OpenShift are both excellent options. Do you know which suits your project best?
Multiple Kronos platforms have been unavailable since December 11. The outage has left millions of users at tens of thousands of customers unable to check pay, arrange rotas, or request paid leave. The issue has bedevilled IT teams globally, who have been forced to spend time in early 2022 supporting their companies with Excel-based workarounds provided by UKG and dealing with other related HR/payroll issues. In the US public sector alone, the New York Metropolitan Transportation Authority, the City of Cleveland, the state of West Virginia, the Oregon Department of Transportation, the University of California system, and Honolulu’s EMS and Board of Water Supply, along with scores of smaller local authorities, have been affected. ... Given these previous claims, many customers have been asking why restoration is taking so long. Asked about this, the company said that it “employs a variety of redundant systems and disaster recovery protocols. In addition to several redundant data centers, UKG Kronos Private Cloud environments are backed up on a weekly basis, as well as on a daily basis with the delta from the previous day.”
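The weekly-full-plus-daily-delta scheme UKG describes can be sketched generically (a hypothetical illustration, not UKG's actual system; the record layout is invented): restoring to a given day means loading the last full backup and replaying each daily delta, which contains only the records that changed, on top of it.

```python
import copy

# Hypothetical payroll records: one weekly full snapshot plus daily deltas.
full_backup = {"alice": {"hours": 40}, "bob": {"hours": 38}}  # Sunday full copy

daily_deltas = [
    {"alice": {"hours": 42}},   # Monday delta: only alice's record changed
    {"carol": {"hours": 20}},   # Tuesday delta: carol was added
]

def restore(full, deltas):
    """Rebuild current state: start from the full backup, replay deltas in order."""
    state = copy.deepcopy(full)          # never mutate the backup itself
    for delta in deltas:
        state.update(delta)              # each delta holds only changed records
    return state

print(restore(full_backup, daily_deltas))
# bob comes from the weekly full; alice and carol come from the daily deltas
```

The trade-off this scheme implies is visible in the outage: restoring is not just copying one file back, since every delta since the last full backup must be replayed, in order, for every affected environment.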
A lot of the technology we’re using at Bees is at the bleeding edge of machine-learning research, which requires us to build advanced, custom machine-learning systems. Out-of-the-box models and AutoML systems like DataRobot are fantastic at democratizing access to machine learning and making it easy and inexpensive to deploy, but they are not well suited to cases where a higher-performing model matters. Bees operates in 13 distinct markets, selling a complex product and customer portfolio against a changing backdrop of shifting consumer preferences, price elasticity, and supply-chain shocks exacerbated in a post-COVID-19 macro landscape. For the use cases we’re tackling with the Bees team, the incremental impact of algorithmic selling is so significant that it more than justifies the development and fine-tuning of advanced active-learning models. That being said, we are huge fans of open-source ML tooling and are power users of many of the biggest frameworks, e.g., PyTorch, Scikit-Learn, Pandas, etc., pushing these tools as far as they can take us and filling in the gaps ourselves whenever necessary.
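As a hypothetical illustration of the active-learning idea mentioned above (not Bees' actual models), the classic uncertainty-sampling strategy picks the unlabeled examples the current model is least confident about, so scarce labeling effort goes where it improves the model most:

```python
# Minimal uncertainty-sampling sketch (illustrative, pure Python).
# A real pipeline would take these probabilities from a trained model.

def least_confident(probabilities, k):
    """Return indices of the k samples whose top-class probability is lowest.

    `probabilities` maps sample index -> list of class probabilities.
    """
    confidence = {i: max(p) for i, p in probabilities.items()}
    return sorted(confidence, key=confidence.get)[:k]

# Hypothetical model outputs for four unlabeled samples:
probs = {
    0: [0.95, 0.05],  # confident prediction
    1: [0.55, 0.45],  # uncertain -> worth labeling
    2: [0.80, 0.20],
    3: [0.51, 0.49],  # most uncertain
}
print(least_confident(probs, k=2))  # -> [3, 1]
```

The selected samples would then be labeled and fed back into training, and the loop repeats; production systems layer much more sophistication (batch diversity, cost-aware selection) on top of this core step.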
Digital transformation has put IT front and center in nearly every organization, which has made the job of protecting the infrastructure much more complicated. The growing importance of data as the lifeblood of business, the fundamental shifts in infrastructure toward cloud and mobile computing, and the resulting target adjustments by cybercriminals and nation-state attackers have moved the spotlight away from the network. So, who should be in charge? Can cybersecurity responsibility be split between the CIO and the CISO? Can they somehow share security duties? No, not effectively. To borrow the old phrase about starting quarterbacks in football, if you have two security chiefs, you really have no security chief. It’s time for businesses and other organizations to seriously consider having their CIO report to the CISO. ... The IT infrastructure, and more specifically the lack of visibility into it, is the biggest weak spot in enterprise security. We’ve gotten to a point where attackers know a company’s network better than the security professionals tasked with protecting it.
Quote for the day:
"Personal leadership is the process of keeping your vision and values before you and aligning your life to be congruent with them." -- Stephen R. Covey