Thomas Ryan’s Post


Founder, Board Member, Security Advisor, Keynote Speaker

End-to-End Encryption vs. Protecting Children: Can We Have Both?

A recent proposal by the European Union has ignited a fierce debate: should tech companies be mandated to scan private messages for child sexual abuse material (CSAM)? Protecting children is imperative, but this proposal raises grave concerns about the integrity of end-to-end encryption.

Encryption is the cornerstone of our online privacy. It ensures that only the sender and recipient can access a message's contents (a short illustrative sketch of this model follows at the end of this post). Undermining encryption would have profound consequences, leaving our communications vulnerable to malicious actors. And how can we trust governments to safeguard our privacy when vulnerabilities already exist in platforms like Signal, a widely trusted encrypted messaging service? Consider the following CVEs:

Signal Desktop: CVE-2023-36665, CVE-2022-37601, CVE-2021-23440, CVE-2019-10747
Signal Server: CVE-2022-1471, CVE-2022-42889, CVE-2022-0839
libsignal: CVE-2023-42282

These vulnerabilities highlight the ongoing challenges of maintaining secure communication channels, making it even more critical to question any measure that could further weaken encryption.

Advocates of the EU's plan assert that it is necessary to combat the proliferation of CSAM. Critics argue that scanning private messages sets a dangerous precedent, paving the way for mass surveillance and infringing on our right to privacy. Moreover, such measures might prove futile, as criminals could simply migrate to more secure platforms.

The critical question is whether we can protect children without compromising our privacy. This issue demands careful deliberation. We must devise solutions that tackle CSAM effectively without dismantling encryption's security advantages. Consider these questions:

1. Are there alternative methods to detect and prevent CSAM that do not involve scanning private messages?
2. Can education and awareness programs be enhanced to empower people to identify and report CSAM?
3. What role can tech companies play in creating solutions that protect children while upholding privacy?

Please engage in this conversation and share your perspectives on this vital issue. Together, we can find a solution that protects both our children and our privacy.

#Signal #cybersecurity #privacy #encryption #childsafety #EU #technology

What do you think? Can we strike a balance between protecting children and safeguarding privacy?

Websites reviewed: https://lnkd.in/e_CHeGfU
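To make the stakes concrete, here is a minimal sketch of the end-to-end model: each party keeps a private key on their device, derives a shared key from the other party's public key, and only the two endpoints can decrypt. This is illustrative Python using the cryptography library, not Signal's actual protocol (Signal uses X3DH and the Double Ratchet); the point is simply that a relay server never sees plaintext, so any scanning mandate means either weakening this model or scanning on the device before encryption.

# Illustrative sketch only: X25519 key agreement + AES-GCM, not Signal's real protocol.
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
import os

# Each party holds a private key; only public keys ever leave the device.
alice_priv = X25519PrivateKey.generate()
bob_priv = X25519PrivateKey.generate()

# Alice combines her private key with Bob's public key to get a shared secret,
# then derives a symmetric message key from it.
shared_secret = alice_priv.exchange(bob_priv.public_key())
key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
           info=b"e2ee-demo").derive(shared_secret)

# Alice encrypts; any server relaying this ciphertext cannot read it.
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, b"hello Bob", None)

# Bob derives the same key from his private key and Alice's public key, then decrypts.
bob_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
               info=b"e2ee-demo").derive(bob_priv.exchange(alice_priv.public_key()))
assert AESGCM(bob_key).decrypt(nonce, ciphertext, None) == b"hello Bob"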

Signal Foundation Warns Against EU's Plan to Scan Private Messages for CSAM

thehackernews.com

Derek Scheller Jr

Practice Manager - Security Engineering at Stratascale

2w

You are definitely asking the important questions today, Thomas. As a father of 5 boys, I understand the importance of doing everything in our power to prevent CSAM. Even so, the ONLY compromise I would be willing to make, and I'm still on the fence about even this, is making it so ONLY images are unencrypted. Is that still a stretch and a loss of privacy? Yes. However, in order to preserve the encrypted, private messaging that Signal was created for, I would at least be willing to debate the ethics of decrypting and scanning images, and whether doing so would still keep the text we send secure.
