How file transfer technology is changing in 2024
As we mark over 50 years since the advent of file transfer protocol (FTP), one might assume that file transfer technologies are just relics of the past. But in 2024, that could not be farther from the truth.
Modern businesses are inundated with data and rely on a complex network of data-intensive applications to achieve crucial operational goals. High-level security is now a fundamental expectation rather than a luxury. Instead of batch processes occurring once daily, there is a demand for real-time, event-driven data flows. Keeping pace with technological advancements requires that legacy systems, and their stewards, evolve and adapt.
Managed file transfer (MFT) technology has become a boardroom topic, with eye-popping fines and lawsuits levied against companies whose MFT solutions failed them. The MIT students who first published these protocols on the eve of the internet couldn’t have imagined the sheer complexity of today’s data landscape — or the increasingly vital role that the file transfer technologies they authored would continue to play in 2024 and beyond.
Security is paramount
Data is the lifeblood of the modern enterprise, and keeping it secure is more critical than ever. Over the last year, news of data breaches and their vast implications has been hard to miss globally. Thousands of organizations have been impacted, with downstream consequences for hundreds of thousands of consumers; it's likely you've received a mailed notice that your data was caught up in one of these breaches. Hundreds of class action lawsuits have followed.
These are not trivial matters. The architectural landscape of file transfer has evolved drastically while cyber threats grow more sophisticated. What was a basic FTP transfer job a few years ago now relies on a complex mix of high-trust and low-trust environments, with sophisticated gateways, gateway agents and reverse proxies standing guard over sensitive information.
Encryption standards are evolving rapidly: Pretty Good Privacy (PGP) remains a staple, while newer standards such as FIPS 140-3 and even quantum-resistant encryption are on the horizon. MFT platforms must offer off-the-shelf integration with tools like Microsoft Azure Key Vault, giving users a seamless way to manage encryption keys. This evolution has driven a shift away from plain FTP toward encrypted protocols such as SFTP and FTPS, coupled with authentication via single sign-on (SSO) and multi-factor authentication (MFA). The target is constantly moving, and file transfer vendors must stay agile to keep ahead.
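To make the FTP-to-FTPS shift concrete, here is a minimal Python sketch, using only the standard library's ftplib and ssl modules, of a client that refuses anything below TLS 1.2 and verifies the server certificate. The hostname and credentials shown are placeholders, not a real endpoint:

```python
import ssl
from ftplib import FTP_TLS

def hardened_tls_context() -> ssl.SSLContext:
    """Build a client-side TLS context that refuses anything below TLS 1.2."""
    ctx = ssl.create_default_context()  # certificate verification on by default
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx

# Hypothetical usage against an FTPS endpoint (host/credentials are placeholders):
# ftps = FTP_TLS(context=hardened_tls_context())
# ftps.connect("files.example.com", 21)
# ftps.login("transfer_user", "********")
# ftps.prot_p()  # switch the data channel to encrypted mode as well
# with open("payroll.pgp", "rb") as f:
#     ftps.storbinary("STOR payroll.pgp", f)
```

Note the `prot_p()` call: FTPS encrypts the control channel at login, but the data channel must be explicitly upgraded, a detail managed platforms handle for you.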
An often forgotten piece of the file transfer landscape is ad hoc file transfer. Unlike traditional file-sharing methods such as email, which often lack security controls and auditing capabilities, an MFT platform provides a secure, compliant and centralized way to transfer files. This is particularly important for ad hoc sharing, where users need to quickly share files within or outside their organizations' network walls without compromising data integrity or exposing sensitive information to unauthorized access. MFT platforms offer end-to-end encryption, detailed audit trails and strict access controls, ensuring that even spontaneous, one-off file transfers meet corporate governance and compliance requirements.
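One common mechanism behind controlled ad hoc sharing (illustrative only, not any specific vendor's design) is a time-limited, signed share link: the platform signs the file path and an expiry timestamp, so the link cannot be tampered with or used after it expires. A minimal Python sketch, assuming a hypothetical `SECRET_KEY` that a real platform would hold in a key vault:

```python
import base64
import hashlib
import hmac

SECRET_KEY = b"demo-secret"  # hypothetical; a real platform fetches this from a key vault

def make_share_token(path: str, expires_at: int) -> str:
    """Sign a file path plus expiry timestamp into an opaque, tamper-evident token."""
    msg = f"{path}|{expires_at}".encode()
    sig = hmac.new(SECRET_KEY, msg, hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(f"{path}|{expires_at}|{sig}".encode()).decode()

def verify_share_token(token: str, now: int) -> bool:
    """Accept the token only if the signature matches and it has not expired."""
    path, expires_at, sig = base64.urlsafe_b64decode(token).decode().rsplit("|", 2)
    expected = hmac.new(SECRET_KEY, f"{path}|{expires_at}".encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected) and now < int(expires_at)
```

Because every token is generated and checked centrally, each share event can also be written to the audit trail, which is exactly the governance gap email attachments leave open.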
Keeping these platforms secure is no small task. Regular penetration testing, static and dynamic code analysis and third-party security audits are no longer optional; they're table stakes. Unfortunately, these practices are not yet standard across the industry. Solutions such as JSCAPE by Redwood and Cerberus by Redwood regularly publish results from these tests.
Alongside quality assurance programs, certifications like ISO 27001 and SOC1/SOC2 should be treated as table stakes as well. Platforms like JSCAPE are among the few solutions that are Drummond Certified for AS2 interoperability, a prominent B2B messaging standard that, according to Drummond Group, “safeguards critical business information that represents billions of dollars each year.”
Compliance is equally non-negotiable, with enterprises depending on MFT providers to navigate and maintain compliance in a complex web of regulations like GDPR, CCPA, HIPAA, PCI DSS and SOX, violations of which can threaten a business’s viability and executive tenure.
With significant changes looming in data movement standards across healthcare, financial services and other sectors, staying ahead of the curve in secure file transfer is not just about safeguarding data — it’s about future-proofing your entire business operation against a fast-growing fabric of data regulations.
Reliability: A multibillion-dollar topic
Enterprise system reliability is paramount to a business’s operations, and CrowdStrike’s recent update, which caused millions of IT systems to fail, was a stark reminder of the duty enterprise software vendors have to their customers.
According to Axios, one in four Fortune 500 companies experienced a service disruption and likely lost a combined $5.4 billion. This makes it abundantly clear that quality assurance (QA) isn’t just a checkbox — it’s the first line of defense against zero-day events and breaches, which put entire enterprise ecosystems at risk. Shockingly, a recent NBER study of 150,000 organizations in the United States disclosed that there is “widespread usage of server software with known vulnerabilities, with 57% of organizations using software with severe security vulnerabilities even when secure versions were available.”
While it's tempting, after CrowdStrike, to "wait and see" as your vendors release patch builds, we view this as an untenable risk for our customers in the file transfer space. The very nature of the data moving through these platforms demands vigilance. External threat actors are knocking at the door, and the risks of internal network access far outweigh any risk to production service stability you assume by upgrading (especially where rollbacks are easy). Redwood Software invests massive resources in QA and engineering for our JSCAPE platform to ensure new builds are up to par, and we strongly advise swift adoption of new releases.
As the offices of the CISO, CFO, CTO and others responsible for enterprise file transfer continue to drive cloud and SaaS transformation initiatives, deployment flexibility has never been more critical. High availability (HA) with auto-scaling — whether on-premises or in the cloud — ensures systems stay resilient and responsive as payloads scale.
Amidst mass data sprawl and the rise of large language models (LLMs), integrating MFT systems with adjacent platforms such as iPaaS creates a future-proof ecosystem for data pipelines. Doing this across SaaS, on-premises or hybrid environments requires a flexibility not generally found in legacy file transfer technologies.
Furthermore, monitoring and integration with logging tools are a must. They provide the transparency and real-time insights required to maintain compliance in a data-centric landscape. It’s imperative to track every file transfer to identify anomalies and escalate them before they become major issues. An MFT solution with audit trails and performance metrics mitigates the risk of the all-too-common, far-too-costly breach.
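As an illustration only (not any vendor's implementation), a few lines of Python show the shape of such an audit trail with a naive size-based anomaly check; production platforms use far richer signals than file size, but the pattern of recording every event and flagging outliers as they arrive is the same:

```python
import statistics
from datetime import datetime, timezone

class TransferAuditLog:
    """Toy audit trail that flags transfers whose size deviates sharply from history."""

    def __init__(self, threshold: float = 3.0):
        self.events = []
        self.threshold = threshold  # flag sizes this many std devs from the mean

    def record(self, filename: str, size_bytes: int, user: str) -> dict:
        event = {
            "ts": datetime.now(timezone.utc).isoformat(),
            "file": filename,
            "size": size_bytes,
            "user": user,
            "anomaly": self._is_anomalous(size_bytes),
        }
        self.events.append(event)  # every transfer is logged, anomalous or not
        return event

    def _is_anomalous(self, size: int) -> bool:
        sizes = [e["size"] for e in self.events]
        if len(sizes) < 5:  # too little history to judge
            return False
        mean, stdev = statistics.mean(sizes), statistics.pstdev(sizes)
        return stdev > 0 and abs(size - mean) > self.threshold * stdev
```

In practice, such flags would feed a SIEM or logging platform so an anomalous transfer is escalated before it becomes a breach headline.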
You don't get these modern architectures and this deployment flexibility from vendors that acquired decades-old file transfer brands and left them on a shelf collecting dust. Redwood is proud to have grown our file-transfer-focused product and engineering teams by more than 300% over the last year, enabling us not only to pursue internal innovation across our products but to ensure that the customers we serve have a voice in an active roadmap that improves every quarter. That is unique in the industry, and it's reflected in our JSCAPE customer reviews on G2.
Enabling accessible file transfer automation
As automation initiatives spread through organizations well beyond the halls of IT, it's critical that MFT platforms cater to the growing number of business users who will not interact through a command-line interface (CLI).
Drag-and-drop builders are paramount: they let HR automate sending encrypted payroll information to an external payroll provider, analysts automate order intake across the supply chain and finance teams build automated report flows to external auditors. Democratizing MFT frees teams to focus on the things that really matter.
That’s not to say that the CLI is not critical, especially as MFT and iPaaS work seamlessly via APIs to serve broad use cases within enterprises. However, this is no longer the only way to drive file transfer, a fact that opens up broad possibilities for new efficiencies across organizations.
Looking ahead to 2025
File transfer is rapidly changing, but a half-century’s worth of roots will keep MFT firmly planted as a critical piece of the enterprise software technology stack for decades to come.
Not all file sharing services will survive and thrive through these rapid changes. Vendors with lackluster support organizations and product roadmaps that originated in the pre-cloud era will be left behind. Companies need MFT vendors they can trust to keep them secure, available and relevant in today's technology ecosystems.
In 2025, AI will go from an industry talking point, as it is in 2024, to a material driver of optimization across enterprise file transfer systems, helping predict and balance workloads during peak usage and monitoring for pattern anomalies that trigger real-time security alerts.
SaaS will move from innovative to table-stakes status, even in regulated enterprise environments, and vendors who do not take this seriously will be left behind.
Security threats will evolve, requiring MFT platform teams to work closely with internal and external security groups to proactively counter these threats.
Know your MFT vendor
Back in 2010, Eric Schmidt, then CEO of Google, proclaimed to an audience in beautiful Lake Tahoe that “Every two days now we create as much information as we did from the dawn of civilization up until 2003. That’s something like five exabytes of data.”
In the nearly 15 years since, the ecosystems around enterprise data have grown at an almost unimaginable scale. Cloud computing has become the standard, data warehouse platforms like Snowflake and Databricks have become household names, and the proliferation of AI is making analysis of these massive datasets more accessible than ever.
One thing that has not changed: Companies need to move massive data sets from point A to point B securely and reliably. While the ecosystem and endpoints have evolved, the DNA remains.
MFT is here to stay and must evolve with the times. We’re very proud of what we’re doing here at Redwood. To see the latest and greatest of JSCAPE and JSCAPE SaaS, book a demo.
About The Author
Max Schultz
Max Schultz is the General Manager of File Transfer at Redwood, overseeing the JSCAPE and Cerberus product lines, and also leads Redwood’s M&A initiatives.
Prior to joining Redwood, Max held senior leadership roles in sales, global customer success and regional management at Test IO, a private equity-backed leader in software quality assurance. Following Test IO's successful exit to NYSE: EPAM in 2019, Max served as General Manager through its integration and was appointed CEO in 2022. His steadfast alignment of go-to-market strategy with product execution consistently drives strong P&L performance and exceptional customer outcomes.
He holds a B.A. in Economics from the University of Southern California.