Once again about video surveillance: cameras, RTSP, ONVIF. And a "bicycle"!

    This topic has already appeared on Habr: habrahabr.ru/post/115808 and habrahabr.ru/post/117735
    Motion-JPEG (MJPEG) is described there.
    The world does not stand still, and neither does video surveillance: more and more other codecs are being used.
    Here I describe my experience in this "world".
    Professionals will not learn anything new; others may simply find it interesting.
    Everything was developed as a learning exercise.
    We will talk about RTP, RTSP, H264, MJPEG, ONVIF, and all of it together.
    Before reading, be sure to look through the articles by the other author linked above.

    What RTSP is can be read here:

    The peculiarity of RTSP is that it does not itself carry the video data we need. Once the connection is established, all the work is done by the RTP protocol ( RFC ).

    With RTP, two transmission modes must be distinguished:
    1. Non-Interleaved Mode (UDP)
    2. Interleaved Mode (TCP)


    Non-Interleaved Mode.
    RTSP establishes the session and tells the camera where to send the data (which UDP ports).
    An example of an RTSP exchange:

    //INFO: connect to: rtsp://10.112.28.231:554/live1.sdp
    OPTIONS rtsp://10.112.28.231:554/live1.sdp RTSP/1.0
    CSeq: 1
    User-Agent: LibVLC/2.1.4 (LIVE555 Streaming Media v2014.01.21)
    RTSP/1.0 200 OK
    CSeq: 1
    Date: Tue, Jan 15 2013 02:02:56 GMT
    Public: OPTIONS, DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE, GET_PARAMETER, SET_PARAMETER
    DESCRIBE rtsp://10.112.28.231:554/live1.sdp RTSP/1.0
    CSeq: 2
    User-Agent: LibVLC/2.1.4 (LIVE555 Streaming Media v2014.01.21)
    Accept: application/sdp
    RTSP/1.0 200 OK
    CSeq: 2
    Date: Tue, Jan 15 2013 02:02:56 GMT
    Content-Base: rtsp://10.112.28.231/live1.sdp/
    Content-Type: application/sdp
    Content-Length: 667
    //667 is the size of the SDP packet; more on it later
    SETUP rtsp://10.112.28.231:554/live1.sdp/track1 RTSP/1.0
    CSeq: 3
    User-Agent: LibVLC/2.1.4 (LIVE555 Streaming Media v2014.01.21)
    Transport: RTP/AVP;unicast;client_port=49501-49502
    RTSP/1.0 200 OK
    CSeq: 3
    Date: Tue, Jan 15 2013 02:02:56 GMT
    Transport: RTP/AVP;unicast;destination=10.112.28.33;source=10.112.28.231;client_port=49501-49502;server_port=6970-6971
    Session: 7BFE9DAA
    SETUP rtsp://10.112.28.231:554/live1.sdp/track2 RTSP/1.0
    CSeq: 4
    User-Agent: LibVLC/2.1.4 (LIVE555 Streaming Media v2014.01.21)
    Transport: RTP/AVP;unicast;client_port=49503-49504
    Session: 7BFE9DAA
    RTSP/1.0 200 OK
    CSeq: 4
    Date: Tue, Jan 15 2013 02:02:56 GMT
    Transport: RTP/AVP;unicast;destination=10.112.28.33;source=10.112.28.231;client_port=49503-49504;server_port=6972-6973
    Session: 7BFE9DAA
    PLAY rtsp://10.112.28.231:554/live1.sdp RTSP/1.0
    CSeq: 5
    User-Agent: LibVLC/2.1.4 (LIVE555 Streaming Media v2014.01.21)
    Session: 7BFE9DAA
    Range: npt=0.000-
    RTSP/1.0 200 OK
    CSeq: 5
    Date: Tue, Jan 15 2013 02:02:56 GMT
    Range: npt=0.000-
    Session: 7BFE9DAA
    RTP-Info: url=rtsp://10.112.28.231/live1.sdp/track1;seq=7746;rtptime=0,url=rtsp://10.112.28.231/live1.sdp/track2;seq=13715;rtptime=0
    


    Remember this line:
    Transport: RTP/AVP;unicast;destination=10.112.28.33;source=10.112.28.231;client_port=49501-49502;server_port=6970-6971

    Interleaved Mode.
    The difference from Non-Interleaved Mode is that all packets are streamed over the same port (the RTSP TCP connection).
    Example:

    OPTIONS rtsp://10.113.151.152:554/tcp_live/profile_token_0 RTSP/1.0
    CSeq: 1
    User-Agent: LibVLC/2.1.4 (LIVE555 Streaming Media v2014.01.21)
    RTSP/1.0 200 OK
    CSeq: 1
    User-Agent: LibVLC/2.1.4 (LIVE555 Streaming Media v2014.01.21)
    Public: OPTIONS, DESCRIBE, SETUP, PLAY, TEARDOWN, SET_PARAMETER
    DESCRIBE rtsp://10.113.151.152:554/tcp_live/profile_token_0 RTSP/1.0
    CSeq: 2
    User-Agent: LibVLC/2.1.4 (LIVE555 Streaming Media v2014.01.21)
    Accept: application/sdp
    RTSP/1.0 200 OK
    CSeq: 2
    Content-Type: application/sdp
    Content-Length: 316
    SETUP rtsp://10.113.151.152:554/tcp_live/profile_token_0/video/h264 RTSP/1.0
    CSeq: 3
    User-Agent: LibVLC/2.1.4 (LIVE555 Streaming Media v2014.01.21)
    Transport: RTP/AVP/TCP;unicast;interleaved=0-1
    RTSP/1.0 200 OK
    CSeq: 3
    Session: 52cd95de
    Transport: RTP/AVP/TCP;interleaved=0-1;unicast
    SETUP rtsp://10.113.151.152:554/tcp_live/profile_token_0/audio/pcma RTSP/1.0
    CSeq: 4
    User-Agent: LibVLC/2.1.4 (LIVE555 Streaming Media v2014.01.21)
    Transport: RTP/AVP/TCP;unicast;interleaved=2-3
    Session: 52cd95de
    RTSP/1.0 200 OK
    CSeq: 4
    Session: 52cd95de
    Transport: RTP/AVP/TCP;interleaved=2-3;unicast
    PLAY rtsp://10.113.151.152:554/tcp_live/profile_token_0 RTSP/1.0
    CSeq: 5
    User-Agent: LibVLC/2.1.4 (LIVE555 Streaming Media v2014.01.21)
    Session: 52cd95de
    Range: npt=0.000-
    RTSP/1.0 200 OK
    CSeq: 5
    Session: 52cd95de
    


    Remember this line:
    Transport: RTP/AVP/TCP;unicast;interleaved=0-1

    Now let's look at what goes where.
    A camera sends video and audio in separate RTP streams. Stream 2n carries the data, stream 2n+1 carries RTCP.
    So video goes over channels 0 and 1, audio over channels 2 and 3.
    Compare the two Transport lines we remembered: in the first case ports are specified, in the second case channels.
    Transport: RTP/AVP;unicast;destination=10.112.28.33;source=10.112.28.231;client_port=49501-49502;server_port=6970-6971
    Transport: RTP/AVP/TCP;unicast;interleaved=0-1

    With Non-Interleaved Mode everything is clear: RTP packets are simply poured into the UDP ports and can be read something like this:

    DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
    s.receive(packet);

    Problems begin with Interleaved Mode. Actually, there should not be any problems: per the RFC we look for the magic char "$", the next byte is the channel (0-3 in our setup), and then come 2 bytes of length. Only 4 bytes in total. But not all cameras are well behaved. For example, the D-Link DCS-2103 "pads" some data after an RTP packet: the interleaved frame declares a size of 1448, the camera sends those 1448 bytes, and then 827 bytes of some kind of garbage follow. (This is what the D-Link DCS-2103 does on firmware 1.00 and 1.20, and it happens constantly; this is often the case with Chinese cameras. The Qihan (356) did not suffer from this.)
    I have no better idea than to simply skip this garbage.
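    Reading one interleaved frame then comes down to a few lines. A minimal sketch (the socket variable and the resync handling are illustrative, not the program's actual code):

    DataInputStream in = new DataInputStream(socket.getInputStream());
    int magic = in.readUnsignedByte();   //must be '$' (0x24); anything else is the garbage to skip
    int channel = in.readUnsignedByte(); //0-1 video, 2-3 audio in our setup
    int length = in.readUnsignedShort(); //big-endian 16-bit payload length
    byte[] payload = new byte[length];
    in.readFully(payload);               //the RTP (or RTCP) packet itself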
    RTP carries the useful data. The RTSP DESCRIBE request returns an SDP packet.
    Examples of SDP (h264, mjpeg, mpeg4):
    v=0
    o=- 1357245962093293 1 IN IP4 10.112.28.231
    s=RTSP/RTP stream 1 from DCS-2103
    i=live1.sdp with v2.0
    t=0 0
    a=type:broadcast
    a=control:*
    a=range:npt=0-
    a=x-qt-text-nam:RTSP/RTP stream 1 from DCS-2103
    a=x-qt-text-inf:live1.sdp
    m=video 0 RTP/AVP 96
    c=IN IP4 0.0.0.0
    b=AS:1500
    a=rtpmap:96 H264/90000
    a=fmtp:96 packetization-mode=1;profile-level-id=640028;sprop-parameter-sets=Z2QAKK2EBUViuKxUdCAqKxXFYqOhAVFYrisVHQgKisVxWKjoQFRWK4rFR0ICorFcVio6ECSFITk8nyfk/k/J8nm5s00IEkKQnJ5Pk/J/J+T5PNzZprQCgDLSpAAAAwHgAAAu4YEAAPQkAABEqjve+F4RCNQ=,aO48sA==
    a=control:track1
    m=audio 0 RTP/AVP 97
    c=IN IP4 0.0.0.0
    b=AS:64
    a=rtpmap:97 G726-32/8000
    a=control:track2

    v=0
    o=- 1357245962095633 1 IN IP4 10.112.28.231
    s=RTSP/RTP stream 3 from DCS-2103
    i=live3.sdp with v2.0
    t=0 0
    a=type:broadcast
    a=control:*
    a=range:npt=0-
    a=x-qt-text-nam:RTSP/RTP stream 3 from DCS-2103
    a=x-qt-text-inf:live3.sdp
    m=video 0 RTP/AVP 26
    c=IN IP4 0.0.0.0
    b=AS:1500
    a=x-dimensions:640,360
    a=control:track1
    m=audio 0 RTP/AVP 97
    c=IN IP4 0.0.0.0
    b=AS:64
    a=rtpmap:97 G726-32/8000
    a=control:track2

    v=0
    o=- 1357245962094966 1 IN IP4 10.112.28.231
    s=RTSP/RTP stream 2 from DCS-2103
    i=live2.sdp with v2.0
    t=0 0
    a=type:broadcast
    a=control:*
    a=range:npt=0-
    a=x-qt-text-nam:RTSP/RTP stream 2 from DCS-2103
    a=x-qt-text-inf:live2.sdp
    m=video 0 RTP/AVP 96
    c=IN IP4 0.0.0.0
    b=AS:1500
    a=rtpmap:96 MP4V-ES/90000
    a=fmtp:96 profile-level-id=1;config=000001B001000001B509000001010000012000845D4C29402320A21F
    a=control:track1
    m=audio 0 RTP/AVP 97
    c=IN IP4 0.0.0.0
    b=AS:64
    a=rtpmap:97 G726-32/8000
    a=control:track2
    


    Read about SDP:
    Since the fashion used to be MJPEG and the current one is H264, we will consider these two.
    With MJPEG everything is quite clear, but with H264 the differences between cameras begin.
    An H264 stream consists of blocks with NAL headers ( 7.4.1 NAL unit semantics ).
    To decode H264, in addition to the H264 data itself you need the SPS (Sequence Parameter Set) and PPS (Picture Parameter Set). The first describes the sequence, the second the picture parameters. Since I know the H264 codec itself only superficially, there will be no deeper description here. SPS is NAL type 7, PPS is type 8; without them it is impossible to decode H264.
    The most interesting part: Qihan sends SPS and PPS directly in the RTP packets, while D-Link does not. Instead, D-Link puts them into the SDP packet, base64-encoded, in the sprop-parameter-sets parameter:
    sprop-parameter-sets=Z2QAKK2EBUViuKxUdCAqKxXFYqOhAVFYrisVHQgKisVxWKjoQFRWK4rFR0ICorFcVio6ECSFITk8nyfk/k/J8nm5s00IEkKQnJ5Pk/J/J+T5PNzZprQCgDLSpAAAAwHgAAAu4YEAAPQkAABEqjve+F4RCNQ=,aO48sA==
    They are sent separated by a comma.
    A decoding option:
    //split by ','
    sps = Base64.decode(props[0].getBytes());
    pps = Base64.decode(props[1].getBytes());
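    For reference, a self-contained variant using only java.util.Base64 (the article's Base64 class is not shown; spropParameterSets is a hypothetical variable holding the fmtp value):

    String[] props = spropParameterSets.split(",");
    byte[] sps = java.util.Base64.getDecoder().decode(props[0]); //SPS, NAL type 7
    byte[] pps = java.util.Base64.getDecoder().decode(props[1]); //PPS, NAL type 8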
    


    Since the cameras are 720p or 1080p, neither a JPEG frame nor an H264 frame fits into a single RTP packet, so frames are cut into packets.
    RTP Payload Format for JPEG-compressed Video
    RTP Payload Format for H.264 Video

    JPEG
    Each RTP packet contains the main JPEG header:
        0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1
       +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
       | Type-specific |              Fragment Offset                  |
       +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
       |      Type     |       Q       |     Width     |     Height    |
       +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
    

    What follows it depends on Type and Q:

    if(getType() < 64){
        return JPEG_HEADER_SIZE;
    } else if(getType() < 128){
        //we have a 3.1.7. Restart Marker header
        return JPEG_HEADER_SIZE + JPEG_RESTART_MARKER_HEADER_SIZE;
    }
    

    To decode the JPEG you need to know, or compute, the quantization tables.
    In my cameras the quantization tables came in the first JPEG packet, so they were simply taken from there.
    All the calculations are in the RFC.
    The last packet of a frame is detected by the Marker bit in the RTP header: if it is 1, this is the last packet of the frame.
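    A one-line sketch of that check (rtpBuffer is assumed to hold the raw RTP packet):

    //the Marker bit is the most significant bit of the second byte of the RTP header (RFC 3550)
    boolean lastPacketOfFrame = (rtpBuffer[1] & 0x80) != 0;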

    H264
    NAL Header
          +---------------+
          |0|1|2|3|4|5|6|7|
          +-+-+-+-+-+-+-+-+
          |F|NRI|  Type   |
          +---------------+
    


    Single NAL Unit Packet
    In our streams these are just SPS and PPS: Type = 7 or Type = 8.
         0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1
        +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
        |F|NRI|  Type   |                                               |
        +-+-+-+-+-+-+-+-+                                               |
        |                                                               |
        |               Bytes 2..n of a single NAL unit                 |
        |                                                               |
        |                               +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
        |                               :...OPTIONAL RTP padding        |
        +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
    


    If an H264 frame does not fit into an RTP packet (1448 bytes), the frame is cut into fragments (5.8. Fragmentation Units (FUs), RFC 6184).
    Type = 28
         0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1
        +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
        | FU indicator  |   FU header   |                               |
        +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+                               |
        |                                                               |
        |                         FU payload                            |
        |                                                               |
        |                               +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
        |                               :...OPTIONAL RTP padding        |
        +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
    

    These headers follow immediately after the RTP header.
    public int getH264PayloadStart() {
            switch(getNAL().getType()){
                case NAL.FU_A:
                    return rtp.getPayloadStart() + 2;
                case NAL.SPS:
                case NAL.PPS:
                    return rtp.getPayloadStart();
                default:
                    throw new NotImplementedException("NAL type " + getNAL().getType() + " not implemented");
            }
        }
    


    This is the information an H264 NAL decoder needs. If the frame was fragmented (FU), the NAL header has to be restored:
    take the first 3 bits of the FU indicator and merge them with the last 5 bits of the FU header.
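    A sketch of that merge (presumably what getReconstructedNal() in the code below does; buffer access follows the article's rtp object):

    byte fuIndicator = rtp.getBuffer()[rtp.getPayloadStart()];     //F + NRI + type 28 (FU-A)
    byte fuHeader    = rtp.getBuffer()[rtp.getPayloadStart() + 1]; //S + E + R + real NAL type
    byte reconstructedNal = (byte) ((fuIndicator & 0xE0) | (fuHeader & 0x1F));
    boolean first = (fuHeader & 0x80) != 0; //S bit: first fragment of the frame
    boolean last  = (fuHeader & 0x40) != 0; //E bit: last fragment of the frame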

    Now the most important part: saving the stream.
    Jpeg
    public void writeRawJPEGtoStream(OutputStream out) throws IOException {
            //if(isMustBeZero()){
            if(isStart()){
                //first
                //System.out.println("first");
                byte[] headers = new byte[1024];
                int length = makeJpeg(headers);
                out.write(headers, 0, length);
                out.write(rtp.getBuffer(), getJPEGPayloadStart(), getJPEGPayloadLength());
            }else
            //if(getMarker()){
            if(isEnd()){
                //end
                //System.out.println("end");
                out.write(rtp.getBuffer(), getJPEGPayloadStart(), getJPEGPayloadLength());
                //EOI
            } else {
              //middle
                //System.out.println("middle");
                out.write(rtp.getBuffer(), getJPEGPayloadStart(), getJPEGPayloadLength());
            }
        }
    

    h264
    public static final byte[] NON_IDR_PICTURE = {0x00, 0x00, 0x00, 0x01};
    public void writeRawH264toStream(OutputStream out) throws IOException, NotImplementedException {
            switch (nal.getType()){
                case NAL.FU_A:    //FU-A, 5.8.  Fragmentation Units (FUs)/rfc6184
                    FUHeader fu = getFUHeader();
                    if(fu.isFirst()){
                        //if(debug) System.out.println("first");
                        out.write(H264RTP.NON_IDR_PICTURE);
                        out.write(getReconstructedNal());
                        out.write(rtp.getBuffer(), getH264PayloadStart(), getH264PayloadLength());
                    } else if(fu.isEnd()){
                        //if(debug) System.out.println("end");
                        out.write(rtp.getBuffer(), getH264PayloadStart(), getH264PayloadLength());
                    } else{
                        //if(debug) System.out.println("middle");
                        out.write(rtp.getBuffer(), getH264PayloadStart(), getH264PayloadLength());
                    }
                    break;
                case NAL.SPS: //Sequence parameter set
                case NAL.PPS: //Picture parameter set
                    //System.out.println("sps or pps write");
                    out.write(H264RTP.NON_IDR_PICTURE);
                    out.write(rtp.getBuffer(), rtp.getPayloadStart(), rtp.getPayloadLength());
                    break;
                default:
                    throw new NotImplementedException("NAL type " + getNAL().getType() + " not implemented");
            }
        }
    

    NON_IDR_PICTURE is needed for decoding: it "separates" the frames ( h264 ).
    Correct me here, since this is just a "crutch" with no justification yet. It simply works.
    The resulting stream is: 00000001 + SPS + 00000001 + PPS + 00000001 + NAL ...
    erlyvideo: 0,0,0,1 is the Annex B prefix of the H264 format. It is not part of an H264 NAL unit but a separator between units.

    And here is the processing of "all" of this:
    while(!stop){
        IRaw raw = rtp;
        //read the interleaved frame
        try {
            while(!frame.fill(in));
            //read the rtp packet in any case
            rtp.fill(in, frame.getLength());
            try {
                raw = rtp.getByPayload();
            } catch (NotImplementedException e) {
                if(log.isLoggable(Level.FINE)) log.fine("rtp seq=" + rtp.getSequence() + ": " + e.getMessage());
            }
        } catch (SocketException e) {
            log.warning(e.getMessage()); //socket closed?
            break;
        }
        byte ch = frame.getChannel();
        //RTCP? //D-Link DCS-2103 firmware 1.00 sent RTCP interleaved as well
        Source s = sources.get(source(ch));
        if(rtp.getPayloadType() == RTPWrapper.TYPE_RTCP){
            byte[] rb = new byte[frame.getLength()];
            System.arraycopy(buffer, 0, rb, 0, rb.length);
            s.lastRTCP = new RTCP(rb, rb.length);    //save the last RTCP
            s.lastRTCPTime = System.currentTimeMillis();
            System.out.println(frame.getLength());
        } else {
            s.calculate(rtp); //update per-source statistics (needed for RTCP)
        }
        if(os.length <= ch){
            log.warning("Need more out streams: " + ch);
            continue;
        }
        profiler.stop();
        counter.count(profiler.getLast(), frame.getLength() / 1000.0);
        //profiler.print(0);
        if(os[ch] == null) continue;
        //synchronization was needed because os[ch] used to change; now it is always a rotator
        synchronized (os[ch]){
            raw.writeRawToStream(os[ch]);
        }
    }
    

    In two words: we receive an RTSP interleaved frame (for example, Channel: 0x00, 1448 bytes), read those 1448 bytes, call writeRawToStream, and polymorphism does the rest.

    Then the session needs to be kept alive.
    It would seem that to keep the RTSP stream going you have to send RTCP reports, but no, everything turned out to be simpler.
    D-Link, Qihan, and VLC just "eat" GET_PARAMETER:
    GET_PARAMETER rtsp://10.112.28.231:554/live3.sdp RTSP/1.0
    CSeq: 7
    User-Agent: LibVLC/2.1.4 (LIVE555 Streaming Media v2014.01.21)
    Session: 327B23C6
    

    Send it once every 55 seconds, and that's it.
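    A minimal keepalive sketch (the cseq counter and the out stream are illustrative names, not the program's actual fields):

    ScheduledExecutorService keepAlive = Executors.newSingleThreadScheduledExecutor();
    keepAlive.scheduleAtFixedRate(() -> {
        String request = "GET_PARAMETER rtsp://10.112.28.231:554/live3.sdp RTSP/1.0\r\n"
                + "CSeq: " + cseq.incrementAndGet() + "\r\n"
                + "Session: 327B23C6\r\n\r\n";
        try {
            out.write(request.getBytes());
            out.flush();
        } catch (IOException e) {
            //socket closed; the reading loop will notice too
        }
    }, 55, 55, TimeUnit.SECONDS);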

    Now, the "bicycle" itself.
    It is simply a program to which you can add a link to a camera (http or rtsp), and it will save the stream. SQLite database. "Normalization" of the stream is done through ffmpeg, viewing through VLC.
    There is no reconnection after disconnects, no handling of file problems, and so on.
    Half of the checks and similar bits are missing.
    What the buttons look like:
    1. Add
    2. Delete
    3. Run
    4. Stop
    5. Archive
    6. Customization
    7. Exit

    (screenshot 1)

    Settings :)
    (screenshot 2)

    Archive
    1. View - launches VLC
    2. Glue and view - glues the files and launches VLC
    3. Exit

    (screenshot 3)

    For simple viewing, an m3u file is generated and fed to VLC.
    (screenshot 4)
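    Generating such a playlist takes a couple of lines; a sketch (recordedFiles is a hypothetical list of the saved file names):

    try (PrintWriter m3u = new PrintWriter("archive.m3u")) {
        m3u.println("#EXTM3U");
        for (String file : recordedFiles) m3u.println(file);
    }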

    When gluing, ffmpeg glues the files and then VLC starts.
    (screenshot 5)

    The program cuts the stream into files; the interval is set in the settings.

    What ffmpeg does:
    Gluing:
    String command = String.format("%s -y -f concat -i concat.txt -codec copy concat.mp4",
                        settings.getFfmpegPath());
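    The concat.txt fed to -f concat is just a list of the input files, one per line (the names here are illustrative):

    file 'rec_000.mp4'
    file 'rec_001.mp4'
    file 'rec_002.mp4'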
    

    "Normalization" (recalculates headers, etc.):
    String command = String.format("%s -i %s -codec copy %s",
                        settings.getFfmpegPath(),
                        settings.getFullTmpPath() + archive,
                        settings.getArchivePath() + "/" + settings.getRecPath() + "/" + archive + ".mp4");


    There are a lot of files at the output.
    (screenshot 6)

    Ideally, it can write to any OutputStream.
    GitHub
    The program may not see any further life. Perhaps someday I will add RTP classes for audio (since I am still fond of SIP).

    And now the tastiest part.
    There is the ONVIF video surveillance standard.
    There is professional hardware that talks to cameras only through it.
    There are cameras that support it (Qihan, aka Proline) whose rtsp links you would otherwise have to google.
    There is an open-source product, ONVIF Device Manager, for managing such hardware.
    I added ONVIF support to the program, both without authorization and with it.
    (screenshot 7)
    GitHub

    In two words about ONVIF: it is SOAP.
    The workflow is simple: 1. send an XML document via POST; 2. get XML back.
    The code is on GitHub. The -s switch saves all XML requests and responses.
    Request example (GetCapabilities):
    <s:Envelope xmlns:s="http://www.w3.org/2003/05/soap-envelope">
      <s:Body>
        <GetCapabilities xmlns="http://www.onvif.org/ver10/device/wsdl">
          <Category>All</Category>
        </GetCapabilities>
      </s:Body>
    </s:Envelope>
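    A minimal sketch of making such a call from Java (the url matches the camera above; soapXml is assumed to hold the envelope):

    URL url = new URL("http://10.112.28.231/onvif/device_service");
    HttpURLConnection c = (HttpURLConnection) url.openConnection();
    c.setRequestMethod("POST");
    c.setDoOutput(true);
    c.setRequestProperty("Content-Type", "application/soap+xml; charset=utf-8");
    c.getOutputStream().write(soapXml.getBytes("UTF-8"));
    try (InputStream in = c.getInputStream()) {
        //read and parse the XML response
    }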

    If you follow the links above, you can get all the documentation on Onvif.
    Answer:
    A GetCapabilitiesResponse: a large XML document listing the capability flags (true/false) and the service addresses, such as http://10.112.28.231:80/onvif/device_service.

    Further ONVIF communication without authorization proceeds in the same vein.

    And here is an example of communication with authorization: a WS-Security UsernameToken header is added to the request.
    <wsse:Security xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd" xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd">
      <wsse:UsernameToken>
        <wsse:Username>admin</wsse:Username>
        <wsse:Password Type="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-username-token-profile-1.0#PasswordDigest">KSsJz8Lx0xPJd4pYdMuFblluNac=</wsse:Password>
        <wsse:Nonce>Y2FsY09udmlm</wsse:Nonce>
        <wsu:Created>2013-01-15T08:00:57.000Z</wsu:Created>
      </wsse:UsernameToken>
    </wsse:Security>

    That is, you need to send this header. (Tested on the D-Link DCS-2103; the other, Chinese, cameras worked without authorization.)

    Timestamp (Created)
    public static String getOnvifTimeStamp(DateTime dateTime){
            return String.format("%4d-%02d-%02dT%02d:%02d:%02d.000Z",
                    dateTime.getDate().getYear(),
                    dateTime.getDate().getMonth(),
                    dateTime.getDate().getDay(),
                    dateTime.getTime().getHour(),
                    dateTime.getTime().getMinute(),
                    dateTime.getTime().getSecond()
            );
        }
    

    Nonce
    public String getNonceDigest(){
            return base64(getNonce().getBytes());
        }
    

    and the password digest (Password_Digest = Base64(SHA-1(nonce + created + password)))
    public String getPasswordDigest(){
            //Password_Digest = Base64 ( SHA-1 ( nonce + created + password ) )
            String line = getNonce() + timestamp + password;
            try {
                line = base64(sha1(line.getBytes()));
                return line;
            } catch (NoSuchAlgorithmException e) {
                e.printStackTrace();
            }
            return "";
        }
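    The sha1() and base64() helpers are not shown in the article; one possible JDK-only implementation:

    static byte[] sha1(byte[] data) throws NoSuchAlgorithmException {
        return MessageDigest.getInstance("SHA-1").digest(data);
    }

    static String base64(byte[] data) {
        return java.util.Base64.getEncoder().encodeToString(data);
    }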
    


    Everything was done for educational purposes. If you have questions, or suddenly need a more detailed description of something, write to me.
    I hope it comes in handy for someone.

    PS There is no need to write in the comments about the organization whose name starts with a capital "I". Their server uses SQLite, SSL, avcodec (ffmpeg), and in the \Resources folder there is a divine file called camera_list.json, but my impudence did not stretch to bolting it onto my program :) I did not see ONVIF support from them, though, apparently because they release "their own" cameras. UPDATE: see the comments from ivideon.

    If you bolt OpenVPN and OpenCV onto the program, you would get an amusing solution and another "bicycle".

    And a useful link for the road: a GitHub database of camera stream links:
