{
	"id": "507545d8-de97-4785-96f3-bcac196ed5a6",
	"created_at": "2026-04-06T01:29:30.206965Z",
	"updated_at": "2026-04-10T13:12:06.616529Z",
	"deleted_at": null,
	"sha1_hash": "0c9aab3401ec584bce65a8f209b32d5a042c9c14",
	"title": "Head in the Clouds",
	"llm_title": "",
	"authors": "",
	"file_creation_date": "0001-01-01T00:00:00Z",
	"file_modification_date": "0001-01-01T00:00:00Z",
	"file_size": 900939,
	"plain_text": "Head in the Clouds\r\nBy Christopher Maddalena\r\nPublished: 2018-09-12 · Archived: 2026-04-06 01:20:52 UTC\r\nMeet the Big Three\r\nIn the cloud space, there are three major providers at this time: Amazon Web Services (AWS), Microsoft Azure,\r\nand Google Compute. Each provider offers relatively low cost services for computing, storage, load balancing,\r\nand more.\r\nThere are too many services to cover in one article, so this article will focus on the basics of the computing and\r\nstorage options, how to track down these cloud assets, and what can be done with assets setup with insecure\r\nconfigurations. This will be a primer for anyone looking to take a closer look at cloud resources from the\r\nperspective of a security auditor, penetration tester, or bounty hunter. Informational articles for each provider were\r\nmade to accompany this article. Links are posted at the bottom of the page.\r\nThe IP addresses used by these services is a good a place as any to start.\r\nIP Address Ranges\r\nThe IP addresses used by the three cloud providers are all published, but the lists are not all simple to fetch.\r\nAmazon Web Services\r\nhttps://posts.specterops.io/head-in-the-clouds-bd038bb69e48\r\nPage 1 of 8\n\nAmazon provides the simplest option for fetching their IP addresses by offering a webpage with easily digested\nJSON:\nhttps://ip-ranges.amazonaws.com/ip-ranges.json\nThat is all there is to it. The JSON makes it simple to do something like create a script that makes a single web\nrequest and parses the JSON.\nMicrosoft Azure\nMicrosoft’s solution is an XML document. Not as easy as JSON, but trivial to parse with a script. Unfortunately,\nthe document is always changing and must be downloaded from the Microsoft Download Center:\nThe downloaded document will have a name that includes the date it was last updated, e.g.\nPublicIPs_20180813.xml. 
The XML is made up of Region nodes that name the region, such as “australiac2,” and include that region’s IP address ranges.\r\nGoogle Compute\r\nGoogle has, by far, the most convoluted solution, which is detailed in the Google Compute Engine documentation. To get the IP address ranges, one must first fetch the TXT DNS record for _cloud-netblocks.googleusercontent.com, like so:\r\nnslookup -q=TXT _cloud-netblocks.googleusercontent.com 8.8.8.8\r\nServer: 8.8.8.8\r\nAddress: 8.8.8.8#53\r\nNon-authoritative answer:\r\n_cloud-netblocks.googleusercontent.com text = “v=spf1 include:_cloud-netblocks1.googleusercontent.com include:_cloud-netblocks2.googleusercontent.com include:_cloud-netblocks3.googleusercontent.com include:_cloud-netblocks4.googleusercontent.com include:_cloud-netblocks5.googleusercontent.com ?all”\r\nThe entries in the SPF record for the “_cloud-netblocks#” names contain the information about Google Compute’s current IP address ranges. To get the full list, additional lookups must be performed, one at a time, for each _cloud-netblocks hostname:\r\nnslookup -q=TXT _cloud-netblocks1.googleusercontent.com 8.8.8.8\r\nServer: 8.8.8.8\r\nAddress: 8.8.8.8#53\r\nNon-authoritative answer:\r\n_cloud-netblocks1.googleusercontent.com text = “v=spf1 include:_cloud-netblocks6.googleusercontent.com ip4:8.34.208.0/20 ip4:8.35.192.0/21 ip4:8.35.200.0/23 ip4:108.59.80.0/20 ip4:108.170.192.0/20 ip4:108.170.208.0/21 ip4:108.170.216.0/22 ip4:108.170.220.0/23 ip4:108.170.222.0/24 ip4:35.224.0.0/13 ?all”\r\nMaking Use of the IP Addresses\r\nAn up-to-date list of these IP addresses is useful for identifying assets hosted “in the cloud.” If an organization has a domain or subdomain that points back to an IP address in one of these lists, it will likely lead to a storage bucket or cloud server.\r\nMaintaining an Updated Master List\r\nThese tasks are all automated in an accompanying script. The script fetches
the latest IP address ranges used by each provider and then outputs all of them as one list in a CloudIPs.txt file. Each range is on a new line following a header naming the service, e.g. “# Amazon Web Services IPs.”\r\nWith a list of IP addresses on hand, it is time to look at two of the major services that use these addresses.\r\nThe Storage Services: Buckets\r\nThe common term for a cloud storage container is a “bucket.” Buckets are terribly useful things for doing everything from basic file storage to offsite backups and web hosting. They have also been at the center of many a recent information leakage blunder because of how easy it is to misconfigure them and expose files to the public internet that never should have been available.\r\nWhen it comes to seeking out vulnerabilities in the cloud, public buckets take the cake for ease of discovery and potential for high impact.\r\nCommon Pitfalls\r\nToday, buckets are private by default and a user must view some warnings before a bucket can be made public. The catch is that public buckets are not inherently bad. They are useful for web hosting and file sharing, so a user may legitimately want a bucket to be public. The issue is that the access controls can be misunderstood, and sensitive documents might be placed inside a bucket without the uploader realizing they will be made available to the internet at large.\r\nEach service enables users to specify access control lists with grantees like “allUsers” or “Everyone” to allow public access. The slightly more restrictive options include AWS’ “Any Authenticated AWS User” and Google’s “allAuthenticatedUsers.” These may sound like they mean authenticated users associated with the account, but they actually mean any user authenticated with AWS or Google.
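Those grantee identifiers lend themselves to a quick triage check. A minimal sketch, assuming the Google identifiers allUsers/allAuthenticatedUsers and the AWS global group URIs; the helper name and the idea of flattening an ACL to a plain list of grantee strings are illustrative, not any provider’s API:

```python
# Grantees that open a bucket beyond the owning account. Treating the
# 'any authenticated user' grantees as public reflects the point above:
# they mean anyone authenticated with the provider, not with the account.
PUBLIC_GRANTEES = {
    'allUsers',                                                    # Google
    'allAuthenticatedUsers',                                       # Google
    'http://acs.amazonaws.com/groups/global/AllUsers',             # AWS
    'http://acs.amazonaws.com/groups/global/AuthenticatedUsers',   # AWS
}

def is_effectively_public(grantees):
    '''True if any grantee string in a bucket ACL opens it to the world.'''
    return any(g in PUBLIC_GRANTEES for g in grantees)

print(is_effectively_public(['owner@example.com', 'allAuthenticatedUsers']))  # True
print(is_effectively_public(['owner@example.com']))                           # False
```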
These misunderstandings have led to numerous documents made public that never should have been.\r\nCommon Bucket HTTP Responses\r\nEach provider returns XML in response to a web request for a bucket, and most use the same XML schema. These standardized responses make it possible to do something as simple as brute-force web requests to enumerate bucket names and record whether or not the buckets contain publicly accessible files.\r\nThe services return this response when a non-existent bucket is requested:\r\n<Error><Code>NoSuchBucket</Code><Message>The specified bucket does not exist</Message></Error>\r\nIf the bucket exists, the XML will start with:\r\n<ListBucketResult><Name>cmaddy</Name><MaxKeys>1000</MaxKeys><IsTruncated>false</IsTruncated> …\r\nThe XML will then display data for any files that are accessible. If the bucket exists but no access whatsoever is allowed, this response is returned:\r\n<Error><Code>AccessDenied</Code><Message>Access denied.</Message><Details>Anonymous caller does not have storage.objects.list access to test.</Details></Error>\r\nNote: This also works for enumerating DigitalOcean “Spaces,” their service offering for buckets and cloud storage.\r\nThe exception to this is Azure. Azure will return XML responses, but it requires some additional information in the web request before it will return a resource. This is covered in depth in the Azure article/cheat sheet linked below.\r\nThe Computing Services: Virtual Private Servers\r\nEach provider also offers users the ability to run virtual machines on their infrastructure. The uses range from short-lived VMs stood up for testing to semi-permanent machines used for hosting applications over a longer period of time.\r\nAWS refers to its offering as Elastic Compute Cloud, or EC2. It is common to hear someone refer to “an EC2” when they mean a virtual machine hosted by AWS’ EC2 service.\r\nGoogle Compute calls its offering Compute Engine.\r\nAzure just calls the service “virtual machines” on the dashboard.
Microsoft keeps it really simple.\r\nCommon Pitfalls\r\nA common issue with these VMs is that they are too often not given the respect they deserve and are easily forgotten. For example, someone may stand up a VM to test a deployment of a new application with some production data to use for QA or user acceptance testing. The VM will be short-lived, so they do not bother with securing the application or hardening the VM image. They may not even bother documenting its existence.\r\nThey make these decisions, perhaps, because they assume no one will ever find the machine or the application before the machine is shut down. The problems begin when that assumption is wrong or they forget to kill the machine.\r\nAs discussed earlier, the IP addresses of these services are documented, and it is best to assume someone is always watching for an opportunity to swoop in and investigate a cloud server running a default configuration.\r\nFor the purposes of this article, it suffices to say these VMs may be just like an organization’s internal hosts. They can contain some of the same data and may even be linked to internal networks, yet they are less likely to be hardened or monitored like an internal asset.\r\nThere are a couple of interesting items that make these VMs different from an ordinary web server in a datacenter.\r\nThe Metadata Service\r\nEach cloud service uses a special IP address, 169.254.169.254, with its virtual machines. This address is used for the metadata service, which each provider documents; see, for example, the Azure Instance Metadata Service documentation on docs.microsoft.com.\r\nThe service offers an array of information about the VM instance.
This is everything from the hostname to much more sensitive data about the host and the account to which it is attached. For example, executing this curl command on an EC2 host will return the security credentials associated with the host, if any:\r\ncurl http://169.254.169.254/latest/meta-data/iam/security-credentials/\r\nIf the metadata service is somehow exposed, such as via a Server-Side Request Forgery vulnerability in an application, an attacker could learn quite a bit about the host and even gain some level of access to a management account by swiping security credentials. Security credentials are covered in the next section.\r\nSnapshots\r\nUsers can create snapshots of virtual machines to save as backups. By default, snapshots are private. However, numerous snapshots have been, and continue to be, made public for some reason. Many of these snapshots likely contain nothing of interest, but with each public snapshot there is always the chance someone set up a VM just the way they like it, created a snapshot, and then mistakenly made that snapshot public along with the machine’s config files and scripts containing passwords and other sensitive data.\r\nTo demonstrate the point, running this command using awscli will return all public snapshots available in AWS’ us-west-2 region:\r\naws ec2 describe-snapshots --region us-west-2\r\nAt the time of this writing, that command returned 16,126 public snapshots in this one region. These snapshots can be copied, mounted, and then browsed for interesting data. Some useful commands, like the one above, are covered in the cheat sheet articles linked below.\r\nIdentity Management: Accounts, Keys, and Access\r\nEach service has a root account: the account that has full control over the cloud services. These accounts can then delegate access by issuing access keys and tokens or adding new users to the account.\r\nEach service handles this a little bit differently.
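The scale of the public-snapshot problem described above can be tallied straight from the CLI’s JSON output. A minimal sketch, assuming the Snapshots/OwnerId field names that aws ec2 describe-snapshots returns; the sample records here are invented and abbreviated:

```python
from collections import Counter

# Invented, abbreviated sample in the shape of the parsed JSON from
# `aws ec2 describe-snapshots --region us-west-2`.
result = {
    'Snapshots': [
        {'SnapshotId': 'snap-0a01', 'OwnerId': '111111111111', 'Encrypted': False},
        {'SnapshotId': 'snap-0a02', 'OwnerId': '111111111111', 'Encrypted': False},
        {'SnapshotId': 'snap-0a03', 'OwnerId': '222222222222', 'Encrypted': True},
    ],
}

snapshots = result['Snapshots']
owners = Counter(s['OwnerId'] for s in snapshots)

print(len(snapshots))          # total public snapshots in the region
print(owners.most_common(1))   # the account exposing the most snapshots
```

The same tally over the real 16,000-plus records quickly surfaces which accounts are leaking the most.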
The differences between the services are detailed in the individual information articles linked below, but the general idea is to avoid using the root account and instead use new users created for specific purposes.\r\nCompromised Accounts\r\nA compromised account or set of access keys/tokens is a potentially serious issue. This occasionally happens completely by accident. Access keys are often used in scripts and may be accidentally committed to code repositories. Even when that slip-up is caught, the keys may not be revoked or completely removed from the repository (removing the keys and committing that change does not remove them from the git history).\r\nEven if the compromised keys have very limited or read-only access to services, they can still be a boon to an attacker. For example, even though the keys may not grant access to a particular storage bucket, there is nothing to stop someone from listing all available buckets on the account. Basic read-only enumeration can provide a great deal of information, like the name of a misconfigured bucket containing sensitive data that might have otherwise been nearly impossible to discover or tie to the organization.\r\nAccessing Services: Using the CLI Tools\r\nEach service offers a web user interface and dashboard, but they all also offer their own command line tools. These tools can be used to easily interact with the services, assuming a valid username and password combination or set of keys is available.\r\nThese tools are used for managing and interacting with the various available services, like uploading files to a bucket or creating a new virtual machine.
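After authenticating, these CLI tools cache credentials on disk in well-known files under the user’s home directory. A minimal sketch of checking for them — the AWS paths are standard, while the Azure and gcloud filenames match the CLI versions of this era and should be treated as assumptions to verify:

```python
from pathlib import Path

# Well-known cloud CLI credential/config files, relative to $HOME.
# The AWS paths are standard; the Azure and gcloud names reflect the
# CLI versions contemporary with this article and may differ today.
CANDIDATES = [
    '.aws/credentials',
    '.aws/config',
    '.azure/accessTokens.json',
    '.azure/azureProfile.json',
    '.config/gcloud/credentials.db',
    '.config/gcloud/access_tokens.db',
]

def find_credential_files(home):
    '''Return every candidate file that exists under the given home dir.'''
    home = Path(home)
    return [home / rel for rel in CANDIDATES if (home / rel).is_file()]

print(find_credential_files(Path.home()))
```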
A potential downside of using these tools is that anyone else with access to an authenticated user’s workstation could abuse them to enumerate a cloud environment.\r\nCLI Authentication Abuses\r\nOnce a user authenticates, their credentials are stored locally in config and credential files. In some cases, these files can be copied and reused on another machine, or the contents can be read and then used to authenticate without needing to copy any files. The specifics of how each service handles authentication are detailed in the individual information articles linked below.\r\nAs a proof of concept, this tool was created to search for and collect these files:\r\nchrismaddalena/SharpCloud (github.com)\r\nSharpCloud is a basic C# console application that checks for the credential and config files associated with each cloud provider. If found, the contents of each file are dumped for collection and reuse.\r\nConclusion\r\nThere is no doubt that cloud services are an amazing resource for individuals and enterprises. The ability to spin up a virtual machine for pennies or use a bucket as a cheap off-site backup solution for a home or small office is fantastic, and the services have only gotten easier to use over time.\r\nThe downside is that cloud services seem to have caused everyone to take a step backwards in some aspects when it comes to security.
The old idea of “security through obscurity” has made a resurgence with cloud services: it is easy to assume a cloud server or bucket cannot be linked to a specific organization because it exists within huge net blocks owned by three of the major technology companies.\r\nThat is why cloud services demand attention when it comes to security audits, penetration tests, and other security reviews. Hopefully this article has helped shed some light on why cloud servers and storage buckets are not the needles in a haystack they may appear to be at first.\r\nThe following links lead to three articles written as companion pieces for this article. There is one for each provider, and each one contains provider-specific information and a deeper look at the command line tools and identity management.\r\nAmazon Web Services\r\nGoogle Compute\r\nMicrosoft Azure\r\nContinued Education\r\nThe flAWS.cloud website is a great resource for walking through many of the common issues outlined above. It has six stages that deal with insecure buckets, virtual servers, and snapshots. As the name suggests, flAWS deals with AWS services and misconfigurations, but the lessons easily translate to Compute and Azure.\r\nSource: https://posts.specterops.io/head-in-the-clouds-bd038bb69e48",
	"extraction_quality": 1,
	"language": "EN",
	"sources": [
		"MITRE"
	],
	"origins": [
		"web"
	],
	"references": [
		"https://posts.specterops.io/head-in-the-clouds-bd038bb69e48"
	],
	"report_names": [
		"head-in-the-clouds-bd038bb69e48"
	],
	"threat_actors": [
		{
			"id": "d90307b6-14a9-4d0b-9156-89e453d6eb13",
			"created_at": "2022-10-25T16:07:23.773944Z",
			"updated_at": "2026-04-10T02:00:04.746188Z",
			"deleted_at": null,
			"main_name": "Lead",
			"aliases": [
				"Casper",
				"TG-3279"
			],
			"source_name": "ETDA:Lead",
			"tools": [
				"Agentemis",
				"BleDoor",
				"Cobalt Strike",
				"CobaltStrike",
				"RbDoor",
				"RibDoor",
				"Winnti",
				"cobeacon"
			],
			"source_id": "ETDA",
			"reports": null
		}
	],
	"ts_created_at": 1775438970,
	"ts_updated_at": 1775826726,
	"ts_creation_date": 0,
	"ts_modification_date": 0,
	"files": {
		"pdf": "https://archive.orkl.eu/0c9aab3401ec584bce65a8f209b32d5a042c9c14.pdf",
		"text": "https://archive.orkl.eu/0c9aab3401ec584bce65a8f209b32d5a042c9c14.txt",
		"img": "https://archive.orkl.eu/0c9aab3401ec584bce65a8f209b32d5a042c9c14.jpg"
	}
}