
How to Use `curl` to Ignore SSL Certificate Errors Safely


When working with APIs or testing websites in development, you might encounter SSL-related errors while using curl. These errors happen when curl can’t verify a website’s SSL certificate—often due to self-signed certificates, expired certificates, or staging environments. In these cases, knowing how to make curl ignore SSL verification can be very useful.

Why SSL Verification Matters

SSL (Secure Sockets Layer), and its modern successor TLS, ensures the data you send and receive over HTTPS is encrypted and secure. Before establishing a secure connection, curl checks whether the server’s certificate is valid and trusted.

If the certificate is:

  • Self-signed (not issued by a trusted authority)
  • Expired
  • Incorrectly configured

Then curl will block the request and show this common error:

curl: (60) SSL certificate problem: self signed certificate

This is by design—to protect you from insecure or fake websites.

How to Ignore SSL in curl

To bypass this verification step (only when you know it’s safe), use the -k or --insecure option:

curl -k https://example.com

or

curl --insecure https://example.com

This tells curl to proceed even if the SSL certificate can’t be verified.

⚠️ Important: Only use --insecure in trusted environments like development or testing. Avoid it in production, as it disables one of the key protections of HTTPS.

Can I Always Ignore SSL by Default?

Yes, but it’s risky. If you want to make curl always ignore SSL, add this to your ~/.curlrc file:

insecure

Again, this is not recommended unless you have full control over the environment and understand the risks involved. If you do set it, you can re-enable verification for a single request by negating the option on the command line with --no-insecure.

Ignore SSL in Code (with Curl Bindings)

If you’re using curl through code, here’s how you can ignore SSL in different languages:

PHP (with cURL)

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "https://example.com");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false); // skip certificate verification
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);     // skip hostname check
$response = curl_exec($ch);
curl_close($ch);

Python (using PycURL)

import pycurl
from io import BytesIO

buffer = BytesIO()
c = pycurl.Curl()
c.setopt(c.URL, 'https://example.com')
c.setopt(c.WRITEDATA, buffer)   # collect the response body
c.setopt(c.SSL_VERIFYPEER, 0)   # skip certificate verification
c.setopt(c.SSL_VERIFYHOST, 0)   # skip hostname check
c.perform()
c.close()

print(buffer.getvalue().decode('utf-8'))
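If PycURL isn’t available, Python’s standard-library ssl module exposes the same switch. This minimal sketch builds an SSL context equivalent to curl -k (again, for testing only); the urllib call in the comment shows where such a context would be used:

```python
import ssl

# Equivalent of curl -k: accept any certificate and skip the hostname check.
ctx = ssl.create_default_context()
ctx.check_hostname = False       # must be disabled before changing verify_mode
ctx.verify_mode = ssl.CERT_NONE  # do not verify the peer certificate

# The context can then be passed to urllib, e.g.:
#   urllib.request.urlopen("https://example.com", context=ctx)
print(ctx.verify_mode == ssl.CERT_NONE)
```

Note the ordering: check_hostname has to be turned off first, because Python refuses to disable certificate verification while hostname checking is still on.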

Again: only use this for testing purposes. Ignoring SSL verification in production environments can make your app vulnerable to man-in-the-middle (MITM) attacks.

When Should You Ignore SSL?

Here are a few legitimate use cases:

  • Testing a new API server with a self-signed certificate
  • Crawling or scraping a dev/staging site without a proper certificate
  • Making internal calls in a local network where security is already managed

Conclusion

The curl -k or --insecure flag can be a handy tool—but it’s not something you should rely on in production. Always use it with caution, and only in trusted environments.

SSL certificates exist to protect users, and skipping verification weakens that layer of protection. If you’re frequently running into SSL errors during scraping or automation tasks, consider:

  • Installing the correct root certificates
  • Using a valid certificate, even in staging
  • Working with a proxy solution that handles SSL verification for you
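On the code side, the safe configuration is usually the default. A quick standard-library sketch (no curl involved) shows what full verification looks like, for contrast with the bypasses above:

```python
import ssl

# The default context verifies the peer's certificate against the system's
# trusted root certificates and checks that the hostname matches.
ctx = ssl.create_default_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)
print(ctx.check_hostname)
```

In other words, every bypass in this article is an explicit opt-out from behavior that is already correct out of the box.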
