
Daily Tech Digest - January 22, 2026


Quote for the day:

"Lost money can be found. Lost time is lost forever. Protect what matters most." -- @ValaAfshar



PTP is the New NTP: How Data Centers Are Achieving Real-Time Precision

Precision Time Protocol (PTP) is more complex to implement than NTP, but the extra effort enables a whole new level of timing synchronization accuracy. ... Keeping network time in sync is important on any network. But it’s especially critical in data centers, which are typically home to large numbers of network-connected devices, and where small inconsistencies in network timing can snowball into major synchronization problems. ... NTP works very well in situations where networks can tolerate timing inconsistencies of up to a few milliseconds (thousandths of a second). Beyond this, NTP-based time syncing is less reliable due to its limitations ... Unlike NTP, PTP doesn’t rely solely on a server-client model for syncing time across networked devices. Instead, it uses time servers in conjunction with a method called hardware timestamping on client devices. Hardware timestamping uses specialized hardware components, usually embedded in network interface cards (NICs), to track time. Central time servers still exist under PTP. But rather than having software on servers connect to the time servers, hardware devices optimized for the task do this work. These devices also include built-in clocks, allowing them to record time data faster than they could if they had to forward it to the generic clock on a server.
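The accuracy PTP achieves rests on timestamped message exchanges between a master and a slave clock. As a minimal sketch (assuming a symmetric network path, with illustrative microsecond values — the function name and numbers are our own, not from the article), the standard offset-and-delay computation looks like this:

```python
def ptp_offset_and_delay(t1, t2, t3, t4):
    """Estimate clock offset and one-way path delay from one PTP exchange.

    t1: master sends Sync        (master clock)
    t2: slave receives Sync      (slave clock)
    t3: slave sends Delay_Req    (slave clock)
    t4: master receives Delay_Req (master clock)
    Assumes the forward and return path delays are equal.
    """
    offset = ((t2 - t1) - (t4 - t3)) / 2  # how far the slave clock is ahead
    delay = ((t2 - t1) + (t4 - t3)) / 2   # one-way network delay
    return offset, delay

# Slave clock runs 150 us ahead of the master; one-way delay is 50 us.
offset, delay = ptp_offset_and_delay(t1=0, t2=200, t3=300, t4=200)
```

The slave then subtracts the computed offset from its clock; hardware timestamping matters because it removes software-stack jitter from t1–t4 themselves.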


Why AI adoption requires a dedicated approach to cyber governance

Today, enterprises are facing unprecedented internal pressure to adopt AI tools at speed. Business units are demanding AI solutions to remain competitive, drive efficiency, and innovate faster. But existing cyber governance and third-party risk management processes were never designed to operate at this pace. ... Without modernized cyber governance and AI-ready risk management capabilities, organizations are forced to choose between speed and safety. To truly enable the business, governance frameworks must evolve to match the speed, scale, and dynamism of AI adoption – transforming security from a gatekeeper into a business enabler. ... What’s more, compliance doesn’t guarantee security. DORA, NIS2, and other regulatory frameworks set only minimum requirements and rely on reporting at specific points in time. While these reports are accurate when submitted, they capture only a snapshot of the organization’s security posture, so gaps such as human errors, legacy system weaknesses, or risks from fourth- and Nth-party vendors can still emerge afterward. Moreover, human weakness is always present, and legacy systems can fail at crucial moments. ... While there’s no magic wand, there are tried-and-tested approaches that resolve and mitigate the risks of AI vendors and solutions. Mapping the flow of data around the organization helps reveal how it’s used and resolve blind spots. Requiring AI tools to include references for their outputs ensures that risk decisions are trustworthy and reliable.


What CIOs get wrong about integration strategy and how to fix it

As Gartner advises, business and IT should be equal partners in the definition of integration strategy, representing a radical departure from the traditional IT delivery and business “project sponsorship” model. This close collaboration and shared accountability result in dramatically higher success rates ... A successful integration strategy starts by aligning with the organization’s business drivers and strategic objectives while identifying the integration capabilities that need to be developed. Clearly defining the goals of technology implementation, establishing governance frameworks and decision-making authority and setting standards and principles to guide integration choices are essential. Success metrics should be tied to business outcomes, and the integration approach should support broader digital transformation initiatives. ... Create cross-functional data stewardship teams with authority to make binding decisions about data standards and quality requirements. Document what data needs to be shared between systems, which applications are the “source of truth.” Define and document any regulatory or performance requirements to guide your technical planning. ... Integrations that succeed in production are designed with clear system-of-record rules, traceable transactions, explicit recovery paths and well-defined operational ownership. Preemptive integration is not about reacting faster — it’s about ensuring failures never reach the business.


CFOs are now getting their own 'vibe coding' moment thanks to Datarails

For the modern CFO, the hardest part of the job often isn't the math—it's the storytelling. After the books are closed and the variances calculated, finance teams spend days, sometimes weeks, manually copy-pasting charts into PowerPoint slides to explain why the numbers moved. ... Datarails’ new agents sit on top of a unified data layer that connects these disparate systems. Because the AI is grounded in the company’s own unified internal data, it avoids the hallucinations common in generic LLMs while offering a level of privacy required for sensitive financial data. "If the CFO wants to leverage AI on the CFO level or the organization data, they need to consolidate the data," explained Datarails CEO and co-founder Didi Gurfinkel in an interview with VentureBeat. By solving that consolidation problem first, Datarails can now offer agents that understand the context of the business. "Now the CFO can use our agents to run analysis, get insights, create reports... because now the data is ready," Gurfinkel said. ... "Very soon, the CFO and the financial team themselves will be able to develop applications," Gurfinkel predicted. "The LLMs become so strong that in one prompt, they can replace full product runs." He described a workflow where a user could simply prompt: "That was my budget and my actual of the past year. Now build me the budget for the next year."


The internet’s oldest trust mechanism is still one of its weakest links

Attackers continue to rely on domain names as an entry point into enterprise systems. A CSC domain security study finds that large organizations leave this part of their attack surface underprotected, even as attacks become more frequent. ... Large companies continue to add baseline protections, though adoption remains uneven. Email authentication shows the most consistent improvement, driven by phishing activity and regulatory pressure. Organizations still leave email domains partially protected, which allows spoofing to persist. Other protections see much slower uptake. ... Consumer oriented registrars tend to emphasize simplicity and cost. Organizations that rely on them often lack access to protections that limit the impact of account compromise or social engineering. Risk increases as domain portfolios grow and change. ... Brand impersonation through domain spoofing remains widespread. Lookalike domains tied to major brands are often owned by third parties. Some appear inactive while still supporting email activity. Inactive domains with mail records allow attackers to send phishing messages that appear associated with trusted brands. Others are parked with advertising networks or held for later use. A smaller portion hosts malicious content, though dormant domains can be activated quickly. ... Gaps appear in infrastructure related areas. DNS redundancy and registry lock adoption lag, and many unicorns rely on consumer grade registrars. These limitations become more pronounced as operations scale.
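The lookalike-domain problem the study describes can be explored with a simple generator of typo and homoglyph variants. This is a toy sketch — commercial monitoring services use far larger substitution tables and also check DNS/MX records for the variants, which this example omits:

```python
def lookalike_candidates(domain):
    """Generate simple lookalike variants of a brand domain:
    homoglyph substitutions, adjacent-character swaps, doubled letters."""
    name, _, tld = domain.partition(".")
    homoglyphs = {"o": "0", "l": "1", "i": "1", "e": "3", "a": "4"}
    variants = set()
    # Homoglyph substitution, one character at a time
    for i, ch in enumerate(name):
        if ch in homoglyphs:
            variants.add(name[:i] + homoglyphs[ch] + name[i + 1:] + "." + tld)
    # Adjacent-character transposition (classic typosquat)
    for i in range(len(name) - 1):
        swapped = name[:i] + name[i + 1] + name[i] + name[i + 2:]
        variants.add(swapped + "." + tld)
    # Doubled letters
    for i, ch in enumerate(name):
        variants.add(name[:i] + ch + ch + name[i:] + "." + tld)
    variants.discard(domain)
    return sorted(variants)

candidates = lookalike_candidates("example.com")
```

Checking which of these candidates are registered — and which of the registered ones carry MX records despite hosting no content — is exactly the "inactive but mail-capable" pattern the study flags.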


Misconfigured demo environments are turning into cloud backdoors to the enterprise

Internal testing, product demonstrations, and security training are critical practices in cybersecurity, giving defenders and everyday users the tools and wherewithal to prevent and respond to enterprise threats. However, according to new research from Pentera Labs, when left in default or misconfigured states, these “test” and “demo” environments are yet another entry point for attackers — and the issue even affects leading security companies and Fortune 500 companies that should know better. ... After identifying an exposed instance of Hackazon, a free, intentionally vulnerable test site developed by Deloitte, during a routine cloud security assessment for a client, Yaffe performed a five-step hunt for exposed apps. His team uncovered 1,926 “verified, live, and vulnerable applications,” more than half of which were running on enterprise-owned infrastructure on AWS, Azure, and Google Cloud platforms. They then discovered 109 exposed credential sets, many accessible via a low-priority lab environment, tied to overly privileged identity access management (IAM) roles. These often granted “far more access” than a ‘training’ app should, Yaffe explained, and provided attackers: administrator-level access to cloud accounts, as well as full access to S3 buckets, GCS, and Azure Blob Storage; the ability to launch and destroy compute resources and to read and write secrets managers; and permissions to interact with container registries where images are stored, shared, and deployed.
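To make the over-privilege concrete: the wildcard grants behind findings like these can be spotted with a simple policy scan. A hypothetical sketch over an AWS-style policy document (the policy shown is invented for illustration; only the standard Effect/Action/Resource fields are assumed, and real tooling such as IAM Access Analyzer does far more):

```python
# Invented example policy in AWS-style structure.
POLICY = {
    "Statement": [
        {"Effect": "Allow", "Action": "s3:GetObject",
         "Resource": "arn:aws:s3:::demo-bucket/*"},
        {"Effect": "Allow", "Action": "*", "Resource": "*"},
        {"Effect": "Allow", "Action": ["secretsmanager:GetSecretValue"],
         "Resource": "*"},
    ]
}

def risky_statements(policy):
    """Flag Allow statements with wildcard actions or wildcard resources --
    the kind of over-privileged grant a 'training' app should never hold."""
    findings = []
    for stmt in policy.get("Statement", []):
        if stmt.get("Effect") != "Allow":
            continue
        actions = stmt.get("Action", [])
        actions = [actions] if isinstance(actions, str) else actions
        if "*" in actions:
            findings.append(("admin-level wildcard action", stmt))
        elif stmt.get("Resource") == "*":
            findings.append(("wildcard resource", stmt))
    return findings

findings = risky_statements(POLICY)
```

Here the scan flags two of the three statements: the `Action: "*"` grant (effectively administrator access) and the secrets-manager grant scoped to every resource.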


Cyber Insights 2026: API Security – Harder to Secure, Impossible to Ignore

“We’re now entering a new API boom. The previous wave was driven by cloud adoption, mobile apps, and microservices. Now, the rise of AI agents is fueling a rapid proliferation of APIs, as these systems generate massive, dynamic, and unpredictable requests across enterprise applications and cloud services,” comments Jacob Ideskog ... The growing use of agentic AI systems and the way they act autonomously, making decisions and triggering workflows, is ballooning the number of APIs in play. “It isn’t just ‘I expose one billing API’,” he continues, “now there are dozens of APIs that feed data to LLMs or AI agents, accept decisions from AI agents, facilitate orchestration between services and micro-apps, and potentially expose ‘agentic’ endpoints ... APIs have been a major attack surface for years – the problem is ongoing. Starting in 2025 and accelerating through 2026 and beyond, the rapid escalation of enterprise agentic AI deployments will multiply the number of APIs and increase the attack surface. That alone suggests that attacks against APIs will grow in 2026. But the attacks themselves will scale and be more effective through adversaries’ use of their own agentic AI. Barr explains: “Agentic AI means that bad actors can automate reconnaissance, probe API endpoints, chain API calls, test business-logic abuse, and execute campaigns at machine scale. Possession of an API endpoint, particularly a self-service, unconstrained one, becomes a lucrative target. And AI can generate payloads, iterate quickly, bypass simple heuristics, and map dependencies between APIs.”
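One generic mitigation against the machine-scale probing Barr describes is per-client rate limiting. A token-bucket sketch (a standard technique, not something the quoted researchers prescribe; the class and the rate/capacity numbers are illustrative):

```python
import time

class TokenBucket:
    """Per-client token bucket: a simple throttle against machine-scale
    automated probing of API endpoints."""

    def __init__(self, rate, capacity, now=None):
        self.rate = rate          # tokens refilled per second
        self.capacity = capacity  # maximum burst size
        self.tokens = capacity
        self.last = time.monotonic() if now is None else now

    def allow(self, now=None):
        """Return True if one request is admitted at time `now`."""
        now = time.monotonic() if now is None else now
        # Refill proportionally to elapsed time, capped at capacity
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# 5-request burst capacity, refilled at 2 requests/second (illustrative).
bucket = TokenBucket(rate=2, capacity=5, now=0.0)
results = [bucket.allow(now=0.0) for _ in range(6)]  # burst of 6 at t=0
```

Rate limiting only slows an adversary down, of course — the article's larger point is that business-logic abuse also needs per-endpoint authorization and anomaly detection.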


Complex VoidLink Linux Malware Created by AI

An advanced cloud-first malware framework targeting Linux systems was created almost entirely by artificial intelligence (AI), a move that signals a significant evolution in the use of the technology to develop advanced malware. VoidLink — which comprises various cloud-focused capabilities and modules and is designed to maintain long-term persistent access to Linux systems — is the first case of wholly original malware being developed by AI, according to Check Point Research, which discovered and detailed the malware framework last week. While other AI-generated malware exists, it's typically "been linked to inexperienced threat actors, as in the case of FunkSec, or to malware that largely mirrored the functionality of existing open-source malware tools," ... The malware framework, linked to a suspected, unspecified Chinese actor, includes custom loaders, implants, rootkits, and modular plug-ins. It also automates evasion as much as possible by profiling a Linux environment and intelligently choosing the best strategy for operating without detection. Indeed, as Check Point researchers tracked VoidLink in real time, they watched it transform quickly from what appeared to be a functional development build into a comprehensive, modular framework that became fully operational in a short timeframe. However, while the malware itself was high-functioning out of the gate, VoidLink's creator proved to be somewhat sloppy in their execution.


What’s causing the memory shortage?

Right now, the industry is suffering the worst memory shortage in history, and that’s with three core suppliers: Micron Technology, SK Hynix, and Samsung. TrendForce, a Taipei-based market researcher that specializes in the memory market, recently said it expects average DRAM memory prices to rise between 50% and 55% this quarter compared to the fourth quarter of 2025. Samsung recently issued a similar warning. So what caused this? Two letters: AI. The rush to build AI-oriented data centers has resulted in virtually all of the memory supply being consumed by data centers. AI requires massive amounts of memory to process its gigantic data sets. A traditional server would usually come with 32 GB to 64 GB of memory, while AI servers have 128 GB or more. ... There are other factors at play here, too, of course. The industry is in a transition period between DDR4 and DDR5, as DDR5 comes online and DDR4 fades away. These transitions to a new memory format are never quick or easy, and it usually takes years to make a full shift. There has also been increased demand from both client and server sides. With Microsoft ending support for Windows 10, a whole lot of laptops are being replaced with Windows 11 systems, and new laptops come with DDR5 memory — the same memory used in an AI server. ... “What’s likely to happen, from a market perspective, is we’ll see the market grow less in ’26 than we had anticipated, but ASPs are likely to stay or increase. ...” he said.


OpenAI CFO Comments Signal End of AI Hype Cycle

By focusing on “practical adoption,” OpenAI can close the gap between what AI now makes possible and how people, companies, and countries are using it day to day. “The opportunity is large and immediate, especially in health, science, and enterprise, where better intelligence translates directly into better outcomes,” she noted. “Infrastructure expands what we can deliver,” she continued. “Innovation expands what intelligence can do. Adoption expands who can use it. Revenue funds the next leap. This is how intelligence scales and becomes a foundation for the global economy.” The framing reflects a shift from big-picture AI promise to day-to-day deployment and measurable results. ... There’s also a gap between what AI can do and how people are actually using it in daily life, noted Natasha August, founder of RM11, a content monetization platform for creators in Carrollton, Texas. “AI tools are incredibly powerful, but for many people and businesses, it’s still unclear how to turn that power into something practical like saving time, making money, or improving how they work,” she told TechNewsWorld. In business, the gap lies between AI’s raw analytical capabilities and its ability to drive tangible, repeatable business outcomes, maintained Nithin Mummaneni ... “The winning play is less ‘AI that answers’ and more ‘AI that completes tasks safely and predictably,'” he continued. “Adoption happens when AI becomes part of the workflow, not a separate destination.”

Daily Tech Digest - April 13, 2020

Banks should be cautious with use of AI in cybersecurity
Financial institutions must be prepared, however, for cybercriminals to counter new defences with continually evolving methods of their own. Instead of executing cyberattacks with the intention of stealing money or making fraudulent payments, cyber criminals may target the machine learning processes, embedding fraudulent mechanics into the way the AI engines work. “One of the big concerns, especially at the regulatory level for the future, is ultimately the underlying data integrity,” Holt says. “So, if the attackers don’t do big enormous payouts immediately but attempt to alter the underlying data, how would that be spotted?” Therein lies the danger for financial services companies which are overly optimistic about the potential of AI in cybersecurity. Dries Watteyne, head of SWIFT’s cyber fusion centre, urges caution in this area. “When talking about the potential of machine learning, I think we shouldn’t forget everything we achieved to date without it.”


Windows Subsystem for Linux 2 Moving into General Availability

As discussed previously, WSL 2 is a change in architecture from WSL 1. Where WSL 1 required a translation layer between the Linux system calls and the Windows NT kernel, WSL 2 ships with a lightweight VM running a full Linux kernel. This VM runs directly on the Windows Hypervisor layer. The kernel includes full system call compatibility and allows for running apps like Docker and FUSE natively on Linux. With this new implementation, the Linux kernel has full access to the Windows file system. This new release brings large improvements to performance, especially for interactions that require accessing the file system. According to Craig Loewen, Program Manager at Microsoft, this could be a 3 to 6 times performance improvement, depending on how file-intensive the application is. He further mentions that unzipping tarballs could see a 20 times performance increase. With this upcoming new version of Windows 10, currently known as version 2004, Microsoft has indicated that the installation and updating process of WSL 2 will be streamlined.


Zoom vs. Microsoft Teams: Video chat apps for working from home, compared


Teams has a similar feel to Slack -- you can talk to team members privately or in specific channels, and you can call attention to the whole group or just an individual with the mention feature.  You can video chat with up to 250 people at once with Teams, or present live to up to 10,000 people. Share meeting agendas prior to a conference, invite external guests to join a meeting, and access past meeting recordings and notes. Meetings can be scheduled in the Teams app or through Outlook. ... The Zoom video conference app works for Android, iOS, PC and Mac. The app offers a basic free plan that hosts up to 100 participants. There are also options for small and medium business teams ($15-$20 a month per host) and large enterprises for $20 a month per host with a 50-host minimum. You can adjust meeting times, and select multiple hosts. Up to 1,000 users can participate in a single Zoom video call, and 49 videos can appear on the screen at once. The app has HD video and audio capabilities, collaboration tools like simultaneous screen-sharing and co-annotation, and the ability to record meetings and generate transcripts.



Creating a Text-to-speech engine with Google Tesseract and Arm NN on Raspberry Pi

The network’s architecture can be divided into three significant steps. The first one takes the input image and then extracts features using several convolutional layers. These layers partition the input image horizontally. For each partition, these layers determine the set of image column features. The sequence of column features is used in the second step by the recurrent layers. The recurrent neural networks (RNNs) are typically composed of long short-term memory (LSTM) layers. LSTMs revolutionized many AI applications, including speech recognition, image captioning, and time-series analysis. OCR models use RNNs to create the so-called character-probability matrix. This matrix specifies the confidence that a given character is found in a specific partition of the input image. The last step uses this matrix to decode the text from the image, usually with the connectionist temporal classification (CTC) algorithm. CTC converts the matrix into the word or sequence of words that makes the most sense.
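The decoding step can be sketched with a toy greedy (best-path) CTC decoder — a simplification of real CTC decoding, using an invented three-symbol alphabet and probability matrix purely for illustration:

```python
def ctc_greedy_decode(prob_matrix, alphabet, blank=0):
    """Best-path CTC decoding: take the argmax symbol per time step,
    collapse consecutive repeats, then drop blanks."""
    best = [max(range(len(col)), key=col.__getitem__) for col in prob_matrix]
    decoded = []
    prev = None
    for idx in best:
        if idx != prev and idx != blank:
            decoded.append(alphabet[idx])
        prev = idx
    return "".join(decoded)

# Toy character-probability matrix over {blank, 'h', 'i'},
# one row per horizontal image partition (time step).
alphabet = {1: "h", 2: "i"}
matrix = [
    [0.1, 0.8, 0.1],  # 'h'
    [0.1, 0.7, 0.2],  # 'h' again (collapsed as a repeat)
    [0.8, 0.1, 0.1],  # blank separates repeated characters
    [0.1, 0.1, 0.8],  # 'i'
]
text = ctc_greedy_decode(matrix, alphabet)
```

The blank symbol is what lets CTC distinguish a repeated character ("ll") from one character spread across several partitions; production decoders typically use beam search with a language model rather than this greedy pass.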


Collaboration answers the call

Senior Reporter Matthew Finnegan, who covers collaboration for Computerworld, addresses the question in the back of everyone's mind: "Remote working, now and forevermore?" Surveys show that the majority of people prefer to work from home — and in organizations that have had mature work-from-home policies for a while, many employees have settled into their new reality as if it's no big deal. The office won't go away overnight. But as long as productivity endures, and as collaboration tools inevitably improve, why not allow people to work wherever they like? Matthew and IDG TechTalk's Juliet Beauchamp discuss these and other possibilities on a special episode of Today in Tech. One thing's for sure: Videoconferencing is proving itself the lifeblood of remote work. But can networks handle it? By all accounts, the public internet and even cloud services have held up remarkably well. Yet as analyst Zeus Kerravala observes in "Videoconferencing quick fixes need a rethink when the pandemic is over," written by Network World contributor Sharon Gaudin, those who return to the office and want to continue Zooming or Webexing could face obstacles.


Why Industries Should Prepare For Mass Blockchain Adoption

First and foremost, the token market is likely to be significantly reduced this year, and only the most highly demanded and well-developed projects will remain as the digital assets traded on exchanges that are increasingly being forced to comply with legal requirements. Another change this year will be a gradual transition to turnkey solutions. The idea of blockchain turnkey solutions was first presented by Bitwings, an official blockchain-based solution of the leading Spanish mobile operator Wings Mobile. Its goal was to create the most secure standards for e-devices without compromising the operating system and its performance. To integrate turnkey solutions, companies need to conduct internal research: analysis of the current market and existing problems, and the potential of the blockchain in different sectors. It’s also worth studying the existing centralized and decentralized solutions and deciding how to integrate the solution into production processes without disrupting their performance. The latter is the most important point; it is one that all executive officers should pay attention to. They must consider the most efficient options for integrating blockchain into their working processes.


DDR5 memory promises a significant speed boost

"What's important about DDR5 is the high level of integration it provides," says Jim Handy, principal analyst with Objective Analysis, an analyst firm specializing in the memory market. "The people who defined this spec took advantage of the fact that Moore's Law not only reduces DRAM's price per bit, but it also makes it cheaper to add increasing amounts of powerful logic to the chip. They have artfully used this to improve the CPU-DRAM bandwidth, to move the Memory Wall a little farther out." The Same Bank Refresh is a good example, Handy says. "For DRAM's entire history a chip couldn't provide data while it was being refreshed. Now Same Bank Refresh allows data to be accessed in banks that aren't undergoing refresh. This does a lot to improve data communication." So when will this start to show up? Last year an Intel roadmap was leaked to the hobbyist press that showed Intel was planning to move to DDR5 and PCI Express 5 (completely skipping PCIe v4) in 2021. Micron has begun sampling DDR5, Hynix said it plans to begin volume production at the end of this year, and Samsung plans to start DDR5 production next year.


Don’t Leave “Ethical Tech” Out of Your Digital Transformation Plan

Few organizations and their leaders develop an overall approach to the ethical impacts of technology use—at least not at the start of a digital transformation. In a recent study, just 35 percent of respondents said their organization’s leaders spend enough time thinking about and communicating the impact of digital initiatives on society. But in order to be truly savvy in the age of advanced, connected, and autonomous technologies, leaders must think beyond designing and implementing technologically driven capabilities. They should consider how to do so responsibly from the start. At Deloitte, we see a relationship between a company’s digital and technological progress—in other words, its tech savviness—and its focus on various ethical issues related to technology. Our research found that 57 percent of respondents from organizations considered to be “digitally maturing” say their organization’s leaders spend adequate time thinking about and communicating digital initiatives’ societal impact, compared with only 16 percent of respondents from companies in the early stages of their digital transformation.


Duplication, fragmentation hamper interoperability efforts, impact patient safety


Duplicate records might also contain incomplete or outdated information and can affect the quality of care by forcing clinicians to make care decisions without important information such as recent lab results, allergies and current medications. Back in 2019, Verato and AdVault partnered on a cloud-based patient matching platform which aims to expand secure identity matching so care teams have seamless access to medical records. Patient matching specialist Verato, which has also partnered with healthcare IT security specialist Imprivata, is of the belief that alignment of disparate patient record platforms will help eliminate duplicate records, establish more accurate care histories and improve patient safety. In a 2016 Ponemon Institute survey, 86% of respondents said they witnessed a medical error as a direct result of misidentification, and indicated that 35% of all denied claims are due to misidentification, which can cost hospitals up to $1.2 million a year. "Many systems still do not communicate and store data in disjointed architectures and an upsurge of identifiers continues to be created," Doug Brown, managing partner of Black Book, said in a statement.
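A deterministic first pass at patient matching — a far simpler relative of the probabilistic identity alignment platforms like Verato's perform — can be sketched as key-based deduplication over normalized fields. The record schema and helper names here are invented for illustration:

```python
def normalize(record):
    """Build a match key from normalized identifying fields."""
    return (
        record["last_name"].strip().lower(),
        record["first_name"].strip().lower()[:1],  # first initial only
        record["dob"],
    )

def find_duplicates(records):
    """Group records whose normalized keys collide -- a deterministic
    first pass before fuzzier probabilistic matching."""
    seen = {}
    duplicates = []
    for rec in records:
        key = normalize(rec)
        if key in seen:
            duplicates.append((seen[key], rec["id"]))
        else:
            seen[key] = rec["id"]
    return duplicates

records = [
    {"id": 1, "first_name": "Janet", "last_name": "Doe", "dob": "1980-02-14"},
    {"id": 2, "first_name": "JANET ", "last_name": " doe", "dob": "1980-02-14"},
    {"id": 3, "first_name": "John", "last_name": "Doe", "dob": "1975-06-01"},
]
dups = find_duplicates(records)
```

Real matching engines add phonetic encodings, address and insurance data, and weighted scoring, precisely because exact-key passes like this one miss misspellings and legitimately changed names.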


COBOL, COVID-19, and Coping with Legacy Tech Debt

With a history that stretches back three generations, COBOL was developed for a different breed of compute, Edenfield says. “These were massive machines that did certain things like number crunching,” he says. “It wasn’t fancy.” COBOL was designed to move across multiple machines and frankly to be readable, Edenfield says. “People could learn it quickly and it was easier than an assembly language where you are programming in very cryptic commands.” As new compute demands emerged, programming languages evolved, Edenfield says. Agile development and other modern processes can be more efficient and fundamentally different from how COBOL and other early programming languages were handled. Despite such advances, it is a challenge to escape those legacy roots. “Because COBOL was so prevalent, they can’t get out of it,” he says. “There’s so much of it. It’s running all the backroom, payment processing for all your major financial institutions; all your big companies have it.” It was common for organizations to constantly build up COBOL-based systems for decades, Edenfield says, with the programmers retiring or moving on. “Pretty soon, the people who wrote the systems aren’t there anymore,” he says.



Quote for the day:


"Many people go fishing all of their lives without knowing it is not fish they are after." -- Henry David Thoreau


December 15, 2014

From Police to Partner: The Changing Role of IT
As an IT professional, it’s your job to equip employees with the tools they need to get their jobs done while policing to make sure all solutions meet security or compliance requirements for the business. Managing employees is especially important with the emergence of BYOIT and as new web-based services gain traction in consumer and business markets — you don’t want them to circumvent your policies when they use their favorite tools, after all.  But how do you get employees on board when you can’t lock them down anymore? You must adopt a new role that strikes a balance between employee needs and preferences and security. You have to become a “partner.”


Three IT Roles at Threat from Self-Service Business Intelligence
Something has to change when decisions are reliant on a team of many because it’s simply not sustainable. It’s too costly when non-technical employees (95% of an organization’s staff) have no way of creating views or information dashboards that integrate all of their relevant data in a unified business intelligence platform. So, going back to my question about how many decision-makers are required to produce a dashboard, the answer really comes down to every organization and its comfort level in empowering employees with the right tools to integrate, cleanse and enrich data themselves.


Cloud Compliance Remains a Challenge
A technical interpretation of the data protection law would solve the problem. Analogously, a meaningful technical solution does not have to stand in the way of unfashionable, non-IT-oriented law. That sounds compelling. A revision of the data protection law would thus not be necessary at all. Caution is called for once again: unlike copyright law, data protection law is not a commercial law. Data protection is a personal right. Hence, citizens' interest in data protection takes precedence over a technical and thus economy-friendly interpretation of the law. As a result, the issue of control and data sovereignty on the cloud remains unresolved to date.


A Terabyte on a Postage Stamp: RRAM Heads into Commercialization
Because of its greater density, RRAM will be able to use silicon wafers that are half the size used by current NAND flash fabricators. In a single chip, it has nearly 10 times the capacity of NAND flash and uses 20 times less power to store a bit of data. It also sports 100 times lower latency than NAND flash, meaning performance is massively improved, according to Crossbar. And because RRAM is fully compatible with the standard manufacturing processes already used in NAND fabrication, no changes will be needed in manufacturing facilities. But before it could send its technology to the factory, Crossbar had to overcome a major technological hurdle -- error-causing electron leaks between memory cells.


Government IT In 2014: GAO's Critique
The General Accountability Office has always been a reliable resource for seeing what IT dilemmas the federal government is grappling with. Through its reports and testimony, the GAO seeks to help the feds keep IT projects on schedule, maintain high levels of security, meet statutory requirements, and make the most of their investments. The GAO produced 31 reports and testimony on IT in 2014. While some reports focused on mundane IT matters, others addressed emerging technologies or uncovered government-wide IT deficiencies that merit inclusion in this roundup.


10 cybersecurity predictions for 2015
Year end is a time for reflection. Based on my history in this space, plus the fact that my day job of running CSC's Global Cybersecurity Consulting business lets me talk to and help hundreds of executives around the world, I wanted to offer my perspective on how 2014 turned out and my thoughts on what to watch for in 2015. ... 2014 had both high- and low-profile attacks against industrial control and SCADA systems, and it continues to be a head-to-head battle where the atom meets the bit.


Does the world need 5G? Driverless cars, IoT, future devices will demand it
5G probably won't diverge from the age-old pattern, but it does come with one added hassle: we just don't have enough spectrum to go around any longer, according to wireless analysts. Roaming in particular could be problematic. "Spectrum is and will remain a major challenge for the success and early rollout of 5G. We don't have enough spectrum in general and 5G is a lot about optimising the use of spectrum. But clearly, allocating more spectrum to 4G and later 5G would help and this is a global challenge... An additional challenge will be to find a globally harmonised band for 5G roaming since all suitable spectrum is already in use in one or another part of the world," said Thibaut Kleiner, head of the European Commission's CONNECT Directorate-General.


Top 10 Big Data Predictions For 2015
Big data has seen massive growth in interest in recent times, as more and more companies invest in various facets of this technology. While businesses' understanding of and willingness to explore big data opportunities matured this year, the coming year is expected to be even more critical, analysts believe. IT market research agency IDC has shared its top 10 predictions for the Big Data and analytics segment. These predictions will help IT leaders and CIOs come up with better strategies in 2015, states the research firm.


6 IT Workforce Predictions for 2015
2015 promises to be a banner year for IT workers as the unemployment rate continues to plummet, salaries increase and organizations double down on retention and engagement strategies. CIO.com asked experts to predict the biggest trends, technology and strategies that will make an impact on hiring and recruiting in 2015. Every new year brings a unique set of challenges and opportunities for IT workers as existing technologies evolve and new technologies emerge. The first half of 2015 looks promising based on these six predictions from career experts.


Secret CIO: Stop Making Stupid Software Decisions
Most LOB experts focus on the here and now. That's what lines of business are all about. But we make major software investments for the future, for requirements we don't necessarily have yet, for the business we want to create. It's difficult for most LOB managers to step into a software assessment project and shift their perspective. They're not being replaced during the evaluation, so they're distracted by present-day work. The three major players in this type of software project have three different objectives. The company wants to power current and future business capabilities -- to increase customer value and create competitive advantage.



Quote for the day:

"The most important quality in a leader is that of being acknowledged as such." -- Andre Maurois

July 12, 2012

4 strategies to combat healthcare fraud
The GAO estimates that in 2010 more than $70 billion in improper payments were made by the federal government within the Medicare and Medicaid programs alone. According to the National Health Care Anti-Fraud Association, between three and ten percent of all healthcare spending is lost to fraud.

Boost Productivity by 'Managing' Your Energy
Running a business is physically and mentally demanding. In many cases it comes with long hours accompanied by high levels of stress. And yet, there's always more to get done than time and energy seem to permit.

When Smart People Get Stupid
When your "go-to" person becomes less reliable, the whole business starts to suffer ... smart people get stupid when they are either overwhelmed or underwhelmed.


How to Evolve from an Employee into a Consultant
If you’re passionate about your job — yet would prefer to be your own boss — shifting into a consulting role may be the ideal path to becoming a small-business owner.


Toshiba Developing New Data Center Products Based on its Flash Memory Chips
Toshiba said Tuesday it is developing a line of products for storing large amounts of data, based on its flash memory chips.


Twitter reveals the anatomy of its major mobile overhaul
Twitter undertook a complete rework of its mobile system during the overhaul -- designing, prototyping, developing, testing, and calibrating it along the way


VMware Zimbra Unifies Business Email, Voice, Texting in Cloud Beta Version
Zimbra Collaboration Server 8.0 unifies key business communication channels -- email, messaging, calendaring, voice, and collaboration workflow -- into a single cloud-based service.


Risk-storming
Like estimates, risks in the real world are rather more subjective than they ought to be and for this reason alone, it makes sense to get input from more than just a single person, especially if that person is the solitary software architect on the team.

Microsoft Surface will be a real iPad rival in the enterprise, say CIOs
... All three groups voted yes, albeit by different margins, reflecting a belief that the Surface tablet could be a genuine challenger to the iPad, Apple's hugely successful tablet that has not faced much serious competition until now ...


Thinking is dead
The value of 'thought' and thinking in IT has diminished, in a way that mirrors society at large, to the stage where design, planning, architecture and anything else other than just banging away at a keyboard appear to have been relegated behind opinions and statements as fact.

Scrum Extensions "Suspended"
Scrum extensions have been "suspended". That's the most recent communication from Scrum.org on the community-proposed, but controversial, new adjuncts to the Scrum methodology.



Quote for the day:

"Some defeats are only installments to victory" -- Jacob Riis