Wed, May 06, 2015 9:29 pm

Curl SSL certificate problem

When attempting to download a file via HTTPS from a website using curl, I saw the error message "SSL3_GET_SERVER_CERTIFICATE:certificate verify failed".

$ curl -o whitelist.txt https://example.com/BLUECOAT/whitelist.txt
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
curl: (60) SSL certificate problem, verify that the CA cert is OK. Details:
error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed
More details here: http://curl.haxx.se/docs/sslcerts.html

curl performs SSL certificate verification by default, using a "bundle"
 of Certificate Authority (CA) public keys (CA certs). If the default
 bundle file isn't adequate, you can specify an alternate file
 using the --cacert option.
If this HTTPS server uses a certificate signed by a CA represented in
 the bundle, the certificate verification probably failed due to a
 problem with the certificate (it might be expired, or the name might
 not match the domain name in the URL).
If you'd like to turn off curl's verification of the certificate, use
 the -k (or --insecure) option.

When I added the -k option, I was able to download the file successfully.

$ curl -o whitelist.txt -k https://example.com/BLUECOAT/whitelist.txt

But I wanted to know what the issue was with the public key certificate and I wanted to get that information from a Bash shell prompt. You can get the certificate from a website using the command openssl s_client -showcerts -connect fqdn:443, where fqdn is the fully qualified domain name for the website, e.g. example.com. Port 443 is the standard port used for HTTPS. The certificate should be stored as a .pem file. When I used openssl s_client -showcerts -connect example.com:443 >example.pem, I saw the message "verify error:num=19:self signed certificate in certificate chain" displayed, which revealed the source of the problem.

A self-signed certificate is one that has been signed by the same entity whose identity it certifies. For a site using a self-signed certificate, your traffic to and from that site is protected from eavesdroppers along the path of the traffic, but the certificate doesn't offer validation that the site belongs to the entity claiming to own it. But, if you have other reasons to trust the site or are only concerned about third parties eavesdropping on your communications with the site, then a self-signed certificate may be adequate. E.g., the site could be your own site or belong to someone or an entity you know is in control of the website. Some organizations use self-signed certificates for internal sites with the expectation that members/employees will ignore browser warnings for the internal websites, though if people become accustomed to ignoring such errors there is the danger that they will also be more prone to ignore such warnings for external sites where a site's true controlling entity isn't the one they expect.

$ openssl s_client -showcerts -connect example.com:443 >example.pem
depth=1 /C=US/ST=Maryland/L=Greenbelt/O=ACME/OU=EXAMPLE/CN=EXAMPLE CA
verify error:num=19:self signed certificate in certificate chain
verify return:0
read:errno=0

The s_client argument to openssl invokes a generic SSL/TLS client that establishes the connection to the server:


       s_client  This implements a generic SSL/TLS client which can establish
                 a transparent connection to a remote server speaking SSL/TLS.
                 It's intended for testing purposes only and provides only
                 rudimentary interface functionality but internally uses
                 mostly all functionality of the OpenSSL ssl library.
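If the server hosts more than one HTTPS site at the same address and your version of OpenSSL supports it, you may also need to pass the site's host name to s_client with the -servername option so the server returns the certificate for the site you want to check, e.g. (using the same example domain):

$ openssl s_client -showcerts -connect example.com:443 -servername example.com >example.pem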

The certificate is stored in example.pem in this case. You would need to edit the file to remove everything except the "BEGIN CERTIFICATE" and "END CERTIFICATE" lines shown below and the lines that lie between them.

-----BEGIN CERTIFICATE-----
-----END CERTIFICATE-----

Or you can use a shell script to obtain the certificate and strip off the extraneous lines. The code for such a script, retrieve-cert.sh, is shown below:

#!/bin/sh
#
# usage: retrieve-cert.sh remote.host.name [port]
#
REMHOST=$1
REMPORT=${2:-443}

# Pipe an empty line to s_client so it exits after printing the certificate
# chain, then keep only the lines between the BEGIN and END CERTIFICATE markers.
echo |\
openssl s_client -connect ${REMHOST}:${REMPORT} 2>&1 |\
sed -ne '/-BEGIN CERTIFICATE-/,/-END CERTIFICATE-/p'
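If the script is saved as retrieve-cert.sh and made executable, it can be run as follows to save a site's certificate to a .pem file (the file name is just an example):

$ chmod +x retrieve-cert.sh
$ ./retrieve-cert.sh example.com 443 >example.pem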

You can obtain information on the certificate from the PEM file using the command openssl x509 -text -in example.pem. If you use -noout -issuer instead of -text, only the issuer information will be displayed, so I could see that the certificate was self-signed with the following command:

$ openssl x509 -noout -in example.pem -issuer
issuer= /C=US/ST=Maryland/L=Greenbelt/O=ACME/OU=EXAMPLE/CN=EXAMPLE CA
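As the message from curl quoted earlier notes, the alternative to turning off verification with -k is to point curl at a CA certificate file with --cacert. For a site like this one, that would mean saving the self-signed CA certificate from the -showcerts output, e.g. to a hypothetical example-ca.pem file, and then referencing that file:

$ curl --cacert example-ca.pem -o whitelist.txt https://example.com/BLUECOAT/whitelist.txt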

If you just want to verify the status of a certificate from the command line without storing the certificate locally, you can add the -verify 0 option.


       -verify depth - turn on peer certificate verification

E.g.:

$ openssl s_client -showcerts -verify 0 -connect example.com:443 
verify depth is 0
CONNECTED(00000003)
depth=1 /C=US/ST=Maryland/L=Greenbelt/O=ACME/OU=EXAMPLE/CN=EXAMPLE CA
verify error:num=19:self signed certificate in certificate chain
verify return:0
88361:error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed:/SourceCache/OpenSSL098/OpenSSL098-52.6.1/src/ssl/s3_clnt.c:998:

You can discard all output from the command except the "verify error" line with a command like the following:

$ openssl s_client -showcerts -verify 0 -connect example.com:443 2>&1 | grep "verify error"
verify error:num=19:self signed certificate in certificate chain
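If you need to run that check against a number of servers, the pipeline can be wrapped in a small script along the lines of the sketch below (the name check-cert.sh is just an example); it pipes an empty line to s_client so the command completes on its own and then reports any verify errors:

#!/bin/sh
#
# usage: check-cert.sh remote.host.name [port]
#
REMHOST=$1
REMPORT=${2:-443}

echo |\
openssl s_client -verify 0 -connect ${REMHOST}:${REMPORT} 2>&1 |\
grep "verify error" || echo "No verify errors reported for ${REMHOST}:${REMPORT}"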

For another internal website, when I accessed the site in Firefox with https://cmportal, Firefox reported the following:

This Connection Is Untrusted

You have asked Firefox to connect securely to cmportal, but we can't confirm that your connection is secure.

Normally, when you try to connect securely, sites will present trusted identification to prove that you are going to the right place. However, this site's identity can't be verified.

What Should I Do?

If you usually connect to this site without problems, this error could mean that someone is trying to impersonate the site, and you shouldn't continue.

When I viewed the technical details for the certificate, Firefox informed me that:

cmportal uses an invalid security certificate. The certificate is only valid for the following names: 192.168.160.242, servera.example.com (Error code: ssl_error_bad_cert_domain)

When I tried downloading the home page for the site with curl, I saw the message below:

$ curl https://cmportal
curl: (51) SSL peer certificate or SSH remote key was not OK

I was able to get past that error with the -k or --insecure parameter to curl, though the page that was then returned reported that I was being denied access to the requested web page because of invalid credentials.

I downloaded the certificate for that site with openssl; since openssl would otherwise wait for input after displaying verify return:0, I piped echo "" to it so the command would complete.

$ echo "" | openssl s_client -showcerts -connect cmportal:443 >example.pem
depth=2 /C=US/O=Acme/OU=Anvils/OU=Certification Authorities/OU=Anvils Root CA
verify error:num=20:unable to get local issuer certificate
verify return:0
DONE

I removed all the lines before the "BEGIN CERTIFICATE" line and all those after the "END CERTIFICATE" line and then checked the certificate in that .pem file with the openssl x509 command. That showed a subject of servera.example.com, whereas I had accessed the site using the name cmportal.

$ openssl x509 -noout -in example.pem -subject
subject= /C=US/O=Acme/OU=Anvils/OU=Services/CN=servera.example.com
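Browsers also check the names listed in a certificate's Subject Alternative Name extension, which is where the 192.168.160.242 and servera.example.com names reported by Firefox would normally appear. If the certificate includes that extension, you can list those names from the PEM file with a command such as:

$ openssl x509 -noout -in example.pem -text | grep -A 1 "Subject Alternative Name"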

If you've accepted a self-signed certificate, or a certificate with other issues, in Firefox, you can view the certificate by following the steps noted in Forgetting a certificate in Firefox.

References:

  1. Retrieving Password Protected Webpages Using HTTPS With Curl
    Date: September 8, 2011
    MoonPoint Support
  2. How To Verify SSL Certificate From A Shell Prompt
    Date: May 23, 2009
    nixCraft
  3. Example sites with broken security certs [closed]
    Asked: November 9, 2009
    Stack Overflow
  4. Command line tool for fetching and analyzing SSL certificate
    Asked: April 17, 2014
    Server Fault
  5. OpenSSL Command-Line HOWTO
    Published: June 13, 2004
    Most recent revision: June 25, 2014
    By: Paul Heinlein
    madboa.com
  6. x509 - Certificate display and signing utility
    OpenSSL: The Open Source toolkit for SSL/TLS

[/network/web/tools/curl] permanent link

Sat, Sep 10, 2011 4:19 pm

Submitting a form with POST using cURL

I needed to submit a form on a webpage using cURL. The form submission was using POST rather than GET. You can tell which method is being used by examining the source code for a page containing a form. If POST is being used, you will see it listed as the form method in the form tag as shown in the example below. A form that uses GET, instead, would have "get" as the form method.

<form method="post" action="https://example.com/cgi-bin/SortDetail.pl">
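If you'd rather check the form method from a shell prompt than view the page source in a browser, you can fetch the page with curl and search for the form tag, e.g. (the URL here is just an example):

$ curl -s https://example.com/somepage.html | grep -i "<form"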

You can use the -d or --data option with cURL to use POST for a form submission.


-d/--data <data>
              (HTTP) Sends the specified data in a POST request  to  the  HTTP
              server,  in  the  same  way  that a browser does when a user has
              filled in an HTML form and presses the submit button. This  will
              cause curl to pass the data to the server using the content-type
              application/x-www-form-urlencoded.  Compare to -F/--form.

              -d/--data is the same  as  --data-ascii.  To  post  data  purely
              binary, you should instead use the --data-binary option. To URL-
              encode the value of a form field you may use --data-urlencode.

              If any of these options is used more than once on the same  com-
              mand  line,  the  data  pieces specified will be merged together
              with a separating  &-symbol.  Thus,  using  '-d  name=daniel  -d
              skill=lousy'  would  generate  a  post  chunk  that  looks  like
              'name=daniel&skill=lousy'.

              If you start the data with the letter @, the rest  should  be  a
              file  name  to read the data from, or - if you want curl to read
              the data from stdin.  The contents of the file must  already  be
              URL-encoded.  Multiple files can also be specified. Posting data
              from a file named 'foobar' would thus be done with --data  @foo-
              bar.

To submit the form using cURL, I used the following:

$ curl -u jsmith:SomePassword -d "Num=&Table=All&FY=&IP=&Project=&Service=&portNo=&result=request&display_number=Find+Requests" -o all.html https://example.com/cgi-bin/SortDetail.pl

In this case the website was password protected, so I had to use the -u option to submit a userid and password in the form -u userid:password. If you omit the :password and just use -u userid, then cURL will prompt you for the password. So, if you want to store the cURL command in a script, such as a Bash script, but don't want to store the password in the script, you can simply omit the :password.
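Another way to keep the password out of a script is curl's -n (--netrc) option, which reads credentials from a .netrc file in your home directory. A minimal sketch, assuming a line like the one below has been added to ~/.netrc and the file's permissions restrict access to your account:

machine example.com login jsmith password SomePassword

With that entry in place, -u jsmith:SomePassword in the command above can be replaced with -n.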

The -d option provides the parameters required by the form and the values for those parameters, which were as follows in this case:

Parameter       Value
Num
Table           All
FY
IP
Project
Service
portNo
result          request
display_number  Find+Requests

The format for submitting values for parameters is parameter=value. Parameters are separated by an ampersand, &.

URLs can only be sent over the Internet using the ASCII character set. Characters that are not allowed in a URL, which include the space character, must be represented with a % followed by two hexadecimal digits. The space character can be represented by + or by %20. So, though the value for "display_number" is "Find Requests", it needs to be sent as Find+Requests or Find%20Requests. You can see a list of other characters that should be encoded at URL Encoding.

In this case, I didn't need to specify values for many parameters in the form, because I wanted the query to cover all potential values for those parameters. So I could just use parameter= followed by an & to indicate that the next parameter in the list follows.
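If your version of curl supports the --data-urlencode option mentioned in the man page excerpt above, you can also let curl encode the space in "Find Requests" for you rather than encoding it yourself; a sketch of the same submission using that option:

$ curl -u jsmith:SomePassword -o all.html \
  -d "Num=&Table=All&FY=&IP=&Project=&Service=&portNo=&result=request" \
  --data-urlencode "display_number=Find Requests" \
  https://example.com/cgi-bin/SortDetail.pl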

References:

  1. cURL - Tutorial
    cURL and libcurl
  2. curl Examples
    Linux Journal | Linux Tips
  3. POST (HTTP)
    Wikipedia, the free encyclopedia
  4. The POST Method
    James Marshall's Home Page
  5. How to submit a form using PHP
    HTML Form Guide - All about web forms!
  6. HTML URL Encoding
    W3Schools Online Web Tutorials
  7. URL Encoding
    Bloo's Home Page

[/network/web/tools/curl] permanent link

Thu, Sep 08, 2011 9:25 pm

Retrieving Password Protected Webpages Using HTTPS With Curl

Mac OS X systems come with the curl command-line tool, which provides the capability to retrieve web pages from a shell prompt. To use the tool, open Finder, go to Applications, then Utilities, and double-click on Terminal to obtain a shell prompt.

Curl is also available for a variety of other operating systems, including DOS, Linux, and Windows. Versions for other operating systems can be obtained from cURL - Download. If you will be retrieving encrypted webpages using the HTTPS protocol, be sure to get a binary version that includes Secure Sockets Layer (SSL) support.

A program with similar functionality is Wget, but that isn't included by default with the current versions of the Mac OS X operating system.

On Mac OS X systems, curl is available in /usr/bin, and help on its options can be found using man curl, curl -h, curl --help, and curl --manual. An online manual can be viewed at cURL - Manual.

To retrieve a webpage that requires a userid and password for access using curl and the HTTPS protocol, you can use a command similar to the one below, where userid and password represent the credentials required to access that particular webpage.

curl -u userid:password https://example.com/somepage.html

If you don't want to include the password on the command line, you can just specify the userid after the -u; curl will then prompt you for the password.

$ curl -u jsmith https://example.com/somepage.html
Enter host password for user 'jsmith':

If you wish to save the output in a file rather than have it go to stdout, i.e., rather than have it appear on the screen, you can use the -o/--output filename option where filename is the name you wish to use for the output file. Curl will provide information on the number of bytes downloaded and the time that it took to download a webpage.

$ curl -u jsmith:somepassword -o somepage.html https://example.com/somepage.html
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 22924    0 22924    0     0  16308      0 --:--:--  0:00:01 --:--:-- 26379
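If you are running curl from a script and don't want that progress information displayed, you can add the -s (or --silent) option to suppress the progress meter, e.g.:

$ curl -s -u jsmith:somepassword -o somepage.html https://example.com/somepage.html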

References:

  1. cURL and libcurl

[/network/web/tools/curl] permanent link
