Turla – development & operations: The Big Picture
Andrzej Dereszowski
(Title slide picture from deviantart.com)

Agenda
• Introduction
• Part I – Development
• Part II – Operations
• How to protect yourself
• Attribution?

A bit of history from my perspective
• 2008 – Agent.BTZ – the threat that hit the Pentagon
• 2009 – some Agent.BTZ incidents here and there
• 2011 – me, tecamac and other researchers get together to analyze a certain complex threat
• Beginning of 2013 – we started distributing our report and helping others handle infections
• Beginning of 2014 – a series of discoveries, started by G Data and BAE Systems

What has been published so far?
• ThreatExpert – "Agent.btz – A Threat That Hit Pentagon" – Nov 2008
• Trend Micro – "Windows XP/Server 2003 Zero-Day Payload Uses Multiple Anti-Analysis Techniques" – Dec 2013
• G Data – "Uroburos – Highly complex espionage software with Russian roots" – Feb 2014
• BAE Systems – "Snake Campaign & Cyber Espionage Toolkit" – Mar 2014
• deresz & tecamac – "Uroburos – The Snake Rootkit" – Mar 2014
• Sourcefire VRT – "Snake Campaign: A few words about the Uroburos Rootkit" – Apr 2014
• F-Secure – "Anatomy of Turla Exploits" – May 2014
• kernelmode.info threads – Jun 2014
• CIRCL – "TR-25 Analysis – Turla / Pfinet / Snake / Uroburos" – Jul 2014
• Symantec – "Turla: Spying tool targets governments and diplomats" – Aug 2014
• Kaspersky – "The Epic Turla Operation" – Aug 2014

Many publications – many names
• There is currently a lot of confusion in the naming scheme
• Final stage: Agent.BTZ / Snake / Turla / Uroburos / Carbon / Pfinet / Snark / Sengoku
• Reconnaissance stage: Epic / Tavdig / WipBot / WorldCupSec / TadjMakhal
• NOT all of them describe the same "product"

PART I – Development

What is Turla?
• A family of related, sophisticated backdoor software
• The name comes from a Microsoft detection signature – an anagram of "Ultra" (Ultra3 was the name of the fake driver)
• All variants are related by shared code

Code history
Agent.BTZ (2006) → Pfinet (2009) → Snake (2011)
• Agent.BTZ: also known as Sengoku
• Pfinet: also known as Carbon – the user-mode-centric Snake
• Snake: also known as Uroburos and Snark – the kernel-mode-centric Snake

Features: summary

  Feature              Agent.BTZ        Pfinet              Snake
  Storage              Hidden folder    Encrypted VFS       Encrypted VFS
  Configuration        Hardcoded        Text file           Key-value store ("queue")
  Networking           Separate exes    Userland payloads   Kernel + userland
  Incoming transports  No               No                  Yes

VirtualBox exploit to load the driver
• Uses a vulnerability in an old (yet still signed!) VirtualBox driver to load its own driver
(Sources: Sourcefire VRT, F-Secure, kernelmode.info) [Pfinet, Snake]
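One possible triage angle here: since the loader abuses a legitimately signed VirtualBox driver, it is cheap to ask whether a host that should not be running VirtualBox nevertheless has a VirtualBox-looking driver service registered. A minimal sketch, assuming Windows, Python's standard winreg module, and the heuristic that the registered service name still contains "vbox" – the dropped driver could of course be registered under any name, so treat this purely as an illustration, not a confirmed Turla indicator:

    # Hypothetical triage sketch: list driver/service registry entries whose
    # name suggests a VirtualBox driver. The "vbox" substring heuristic and the
    # idea of flagging such services on hosts without VirtualBox installed are
    # assumptions made for illustration only.
    import winreg

    SERVICES_KEY = r"SYSTEM\CurrentControlSet\Services"

    def find_vbox_driver_services():
        """Return (service_name, image_path) pairs that look like VirtualBox drivers."""
        hits = []
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, SERVICES_KEY) as services:
            index = 0
            while True:
                try:
                    name = winreg.EnumKey(services, index)
                except OSError:  # no more subkeys
                    break
                index += 1
                if "vbox" not in name.lower():
                    continue
                try:
                    with winreg.OpenKey(services, name) as svc:
                        image_path, _ = winreg.QueryValueEx(svc, "ImagePath")
                except OSError:
                    image_path = "<no ImagePath value>"
                hits.append((name, image_path))
        return hits

    if __name__ == "__main__":
        for name, path in find_vbox_driver_services():
            print(f"Possible VirtualBox driver service: {name} -> {path}")

On a machine that legitimately runs VirtualBox this will obviously list benign entries; the point is to have a quick question to ask of hosts where no VirtualBox is expected.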
Udis86: on-the-fly manipulation of disassembled code in the live kernel
(Source: deresz & tecamac) [Agent.BTZ, Pfinet, Snake]

Hooking engine – Udis86 reused
(Source: deresz & tecamac) [Agent.BTZ, Pfinet, Snake]

Encrypted VFS
• Implemented in Carbon and Snake
• CAST-128 encryption is used
• Decryption/encryption is implemented at a low level by hooking the sector-processing mechanisms
(Source: deresz & tecamac) [Pfinet, Snake]

Encrypted VFS (continued)
• The encrypted container is located in %windows%\$NtUninstallQ817473$\hotfix.dat
• Two volumes: permanent (mapped to a file on the real file system) and volatile storage
(Source: deresz & tecamac) [Pfinet, Snake]

Configuration mechanism
• Agent.BTZ:
  - Config hardcoded in the user-mode executables
• Pfinet:
  - Configuration stored on the VFS in a flat file: config.txt
  - Transports implemented in user mode
  - User-mode payloads hardcoded in the rootkit body: cryptoapi.dll, inetpub.dll

Configuration mechanism (continued)
• Snake:
  - Uses a "queue" file that contains configuration parameters in the form of key/value pairs
  - The "queue" file is located on the VFS
  - The queue contains:
    - Transport configuration
    - Userland payloads: inj_snake_Win32.dll (the userland counterpart of the rootkit), inj_services_Win32.dll, rkctl_Win32.dll

Modular transports – channel elements (Snake)
[Diagram of channel element types: Type 1 (b), Type 2 (m), Type 3 (t), Type 4 (d), Type 5 (s); the elements shown include tcp network, udp network, doms, domc, sicmp, t2mmpx, np \\.\pipe\P, enc, frag, reliable, and the converters m2d, m2b, d2s]

Modular transports – combined together (Snake)
[Diagram: the elements are chained into complete transports spanning kernel and user mode; example chains shown: enc.frag.np (HTTP covert channel), domc.np and frag.enc.reliable.doms.np (datagram covert channel)]

Protocols to choose
• Datagram covert channels:
  - Raw layer 2 (Ethernet type 0x7FF)
  - Raw ICMP
  - Raw UDP
  - DNS
  - Raw IP
• Stream covert channels and activation triggers:
  - Raw TCP
  - HTTP: URL parameters of an HTTP request
  - HTTP: hidden in HTTP headers
  - HTTP: hidden in the local part of the URL
  - SMTP: triggered by a recipient e-mail address

Examples of incoming transports – covert channels
SMTP covert channel – the rootkit resides on the mail server of pwned-prg.com:

  HELO whatever.com
  250 Hello whatever.com, I am glad to meet you
  MAIL FROM:<…>
  250 OK
  RCPT TO:<…>
  250 OK
  354 End data with <CR><LF>.<CR><LF>
  .

• The recipient user name must be 10 characters long
• The last two characters are a checksum calculated over the first 8 (sketched in code below):
  username[9] == sum / 26 + 65
  username[10] == 122 - sum % 26

Examples of incoming transports
HTTP covert channel – the rootkit resides on the web server of pwned-prg.com:

  GET / HTTP/1.1
  SomeHeader: trueburgerYmFzZTY0ZW5jb2RlZCBzdHJpbmcKYmFz…

• The same kind of signature is calculated on the first 10 bytes of the header value
• The base64 content that follows is decoded and XOR-ed back with the raw buffer starting at offset 0
• The first four bytes of the resulting content are a magic value – 0xDEADBEEF by default, but it can be changed via the initialization queue
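The recipient-address trigger from the SMTP slide above is concrete enough to turn into a simple mail-log check. A minimal sketch, assuming that "sum" is the plain byte sum of the first eight characters (the slide does not spell that out) and mapping the slide's 1-based indices to Python's 0-based ones:

    # Hypothetical triage check for the SMTP activation trigger described above.
    # Assumption: "sum" is the byte sum of the first 8 characters of the
    # 10-character recipient local part; the last two characters are derived
    # from it exactly as shown on the slide.

    def looks_like_snake_recipient(local_part: str) -> bool:
        """Return True if a recipient's local part matches the slide's checksum rule."""
        if len(local_part) != 10:
            return False
        byte_sum = sum(ord(c) for c in local_part[:8])
        check_9 = byte_sum // 26 + 65       # slide: username[9]  == sum / 26 + 65
        check_10 = 122 - byte_sum % 26      # slide: username[10] == 122 - sum % 26
        return ord(local_part[8]) == check_9 and ord(local_part[9]) == check_10

    if __name__ == "__main__":
        # Made-up addresses: the second one is constructed to satisfy the rule.
        for rcpt in ["john.smith@pwned-prg.com", "abcdefgh_b@pwned-prg.com"]:
            local = rcpt.split("@", 1)[0]
            print(rcpt, "->", looks_like_snake_recipient(local))

A single odd-looking recipient proves nothing on its own, but the exercise shows why the signatures discussed next are hard to write: the trigger hides inside perfectly valid protocol fields.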
SNORT signatures – difficult to create!
• Possible to create Suricata signatures with the use of Lua

Big picture view of a compromised network

Developers
• vlad, gilg, urik
• Version control info is present in some of the samples:

  $Id: snake_config.c 5204 2007-01-04 10:28:19Z vlad $
  $Id: mime64.c 12892 2010-06-24 14:31:59Z vlad $
  $Id: event.c 14097 2010-11-01 14:46:27Z gilg $
  $Id: named_mutex.c 15594 2011-03-18 08:04:09Z gilg $
  $Id: nt.c 20719 2012-12-05 12:31:20Z gilg $
  $Id: ntsystem.c 19662 2012-07-09 13:17:17Z gilg $
  $Id: rw_lock.c 14516 2010-11-29 12:27:33Z gilg $
  $Id: rk_bpf.c 14518 2010-11-29 12:28:30Z gilg $
  $Id: t_status.c 14478 2010-11-27 12:41:22Z gilg $
  $Id: l1_check.c 4477 2006-08-28 15:58:21Z vlad $
  $Id: m2_to_b2_stub.c 4477 2006-08-28 15:58:21Z vlad $
  $Id: m_frag.c 8715 2007-11-29 16:04:46Z urik $

Who are they?

PART II – Operations

Hmm, which group are we talking about?
• Not sure we can speak about one "Turla group"
• Turla is just one of the tools the "Turla group(s)" use
• There are, however, a lot of things in common…
• While the tool itself is quite impressive, the operators using it are sloppy…

Countries of interest
(Source: BAE Systems)

Publicly known victims

Turla: staged operation
• Stage 0 – attack stage – infection vector
• Stage 1 – reconnaissance stage – initial backdoor
• Stage 2 – lateral movements
• Stage 3 – "access established" stage – Turla deployed
• At each stage they can quit if it turns out that a "non-interesting" victim has been encountered

Stage 0: infection vector
• Traditional infection vector – spear phishing with exploits (CVE-2013-3346 + CVE-2013-5065)
• Watering holes (strategic web compromise)
• The "Adobe update" social-engineering trick
• Java exploits (CVE-2012-1723), Adobe Flash exploits (unknown) or Internet Explorer 6/7/8 exploits (unknown)
• Compromise of third-party suppliers
• (Almost) no use of 0-day exploits

Stage 0: watering-hole mechanism
(Source: Kaspersky Lab)

Stage 0: watering-hole panel

Stage 0: web shell

Stage 1: reconnaissance stage
• The initial backdoor is dropped – WipBot/Epic/Tavdig
• A simple backdoor with a handful of commands
• Has no code in common with any variant of Turla, but exports functions with the same names: ModuleStart and ModuleStop
• Well described in the Kaspersky Lab report

Stage 1: some interesting tricks used in WipBot
• CVE-2013-3346 – a zero-day used together with a known exploit
• Sets ThreadHideFromDebugger – breaks debugging
• Creates a new process in suspended mode and maps the same section of memory twice, in two different processes
• Uses a SetWindowLong API call to start a thread in the newly created process – breaks most malware sandboxes
• Jumps several times from one process to another
• Wipes out the PE section so that it is harder to rebuild the unpacked executable
(Source: Trend Micro)

Stage 2: lateral movements
• Stage 1 C&C servers are easy targets – the samples can, for example, be caught in spear-phishing e-mails and sandboxes
• Stage 2 backdoor: so let's replace this with a less well-known backdoor
• Go after Domain Admin credentials
• Further explore and compromise the network

Stage 3: Turla
• The network has been found interesting enough for long-term exploration and exfiltration
• The network is fully compromised
• Turla is dropped on chosen machines
• Many other tools are used
• Some networks have been owned for years…

How to detect Turla?
• Not a very easy task…
• One fun story to tell :)
• Do not rely only on vendors – talk to your partner organizations
• Establish relationships and share information!
• IOC exchange is good but not enough these days:
  - IOCs are easy for the intruders to change
  - Separate samples and infrastructure are used for different victims
• Good Yara signatures and custom detection tools can help (see the sketch below)
• Check your third-party suppliers – for intruders it's a perfect way to get in
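To make the "good Yara sigs and custom detection tools" point a bit more tangible, here is a minimal sketch assuming the yara-python package. The strings are simply artefacts quoted earlier in this deck (the injected DLL names, the VFS container path, two of the svn $Id tags); treat them as a starting point to tune against your own samples, not a vetted production rule.

    # Hypothetical detection sketch using yara-python (pip install yara-python).
    # The strings below come from artefacts mentioned earlier in the deck and
    # are illustrative only; tune and test before relying on them.
    import sys
    import yara

    RULE_SOURCE = r'''
    rule turla_deck_artefacts
    {
        strings:
            $dll1 = "inj_snake_Win32.dll" ascii wide
            $dll2 = "inj_services_Win32.dll" ascii wide
            $dll3 = "rkctl_Win32.dll" ascii wide
            $vfs  = "\\$NtUninstallQ817473$\\hotfix.dat" ascii wide
            $id1  = "$Id: snake_config.c" ascii
            $id2  = "$Id: rk_bpf.c" ascii
        condition:
            any of them
    }
    '''

    def scan(path: str) -> None:
        rules = yara.compile(source=RULE_SOURCE)
        for match in rules.match(path):
            print(f"{path}: matched rule {match.rule}")

    if __name__ == "__main__":
        for target in sys.argv[1:]:
            scan(target)

Saved, for example, as scan.py, it can be pointed at suspicious binaries or memory dumps: python scan.py sample1.bin sample2.bin.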
Divagations on attribution
• Development:
  - vlad, gilg, urik
  - "Transmittion", "Password it's wrong", etc.
  - zagruzchik.dll ("zagruzchik" is Russian for "loader")
• Operations:
  - Geographic distribution of infections
  - VirusTotal submission countries
  - $default_charset = 'Windows-1251';

Questions?
deresz@gmail.com
@deresz666