Deploy a static web page on AWS S3 with protection for specific static resources
I have a static HTML page.
What I want is:
When anyone browses this page, its JavaScript can fetch the JSON data it needs.
When anyone opens the JSON data link directly, the request should be DENIED, so they cannot get the raw data. (Note: the Referer header used below can be spoofed with tools like curl, so this is hotlink deterrence, not strong access control.)
Upload the static web page(s) to one public S3 bucket (B1).
Enable B1's Properties -> Static website hosting.
Upload the static resources that should be protected to another public S3 bucket (B2).
Set up B2's Permissions -> Bucket policy:
{
    "Version": "2012-10-17",
    "Id": "PreventHotLinking",
    "Statement": [
        {
            "Sid": "Allow get requests referred by specific site",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::www2.kingsharkworld.com/*",
            "Condition": {
                "StringLike": {
                    "aws:Referer": "http://www1.kingsharkworld.com.s3-website.ap-northeast-2.amazonaws.com/*"
                }
            }
        },
        {
            "Sid": "Explicit deny to ensure requests are allowed only from specific referer.",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": "arn:aws:s3:::www2.kingsharkworld.com/*",
            "Condition": {
                "StringNotLike": {
                    "aws:Referer": "http://www1.kingsharkworld.com.s3-website.ap-northeast-2.amazonaws.com/*"
                }
            }
        }
    ]
}
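If you want to reuse this allow/deny pattern for other buckets, the policy document can be generated programmatically. A minimal Node.js sketch; the bucket name and referer pattern are the example values from this post, and the statement Sids are shortened for illustration:

```javascript
// Build the allow/deny Referer bucket policy for any bucket.
function refererPolicy(bucket, referer) {
  const resource = `arn:aws:s3:::${bucket}/*`;
  return {
    Version: "2012-10-17",
    Id: "PreventHotLinking",
    Statement: [
      {
        // Allow reads only when the request carries the expected Referer.
        Sid: "AllowGetFromReferer",
        Effect: "Allow",
        Principal: "*",
        Action: "s3:GetObject",
        Resource: resource,
        Condition: { StringLike: { "aws:Referer": referer } },
      },
      {
        // Explicitly deny everything else.
        Sid: "DenyOtherReferers",
        Effect: "Deny",
        Principal: "*",
        Action: "s3:*",
        Resource: resource,
        Condition: { StringNotLike: { "aws:Referer": referer } },
      },
    ],
  };
}

const policy = refererPolicy(
  "www2.kingsharkworld.com",
  "http://www1.kingsharkworld.com.s3-website.ap-northeast-2.amazonaws.com/*"
);
console.log(JSON.stringify(policy, null, 2));
```

Save the printed JSON to `policy.json` and it can be applied from the command line with `aws s3api put-bucket-policy --bucket www2.kingsharkworld.com --policy file://policy.json`, instead of pasting it into the console.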
Set up B2's Permissions -> Cross-origin resource sharing (CORS) to allow B1's origin and the required HTTP methods (GET alone is enough for read-only fetches, though the example below allows more):
[
    {
        "AllowedHeaders": [
            "*"
        ],
        "AllowedMethods": [
            "PUT",
            "POST",
            "DELETE",
            "GET"
        ],
        "AllowedOrigins": [
            "http://www1.kingsharkworld.com.s3-website.ap-northeast-2.amazonaws.com"
        ],
        "ExposeHeaders": []
    }
]
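The same CORS rule can be built in code and restricted to GET only, which is all a read-only fetch needs. A sketch, using the example B1 website endpoint as the origin:

```javascript
// Build a CORS rule list allowing the given origin and methods.
// Defaults to GET only, sufficient for fetching JSON from the page.
function corsRules(origin, methods = ["GET"]) {
  return [
    {
      AllowedHeaders: ["*"],
      AllowedMethods: methods,
      AllowedOrigins: [origin],
      ExposeHeaders: [],
    },
  ];
}

const cors = corsRules(
  "http://www1.kingsharkworld.com.s3-website.ap-northeast-2.amazonaws.com"
);
console.log(JSON.stringify(cors, null, 2));
```

The S3 console accepts this bare array in the CORS editor; if you apply it via `aws s3api put-bucket-cors` instead, wrap it as `{"CORSRules": [...]}` in the file passed to `--cors-configuration`.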
Disable B2's Properties -> Static website hosting.
B1's static HTML file can now fetch B2's objects using each object's Object URL.
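On the B1 page, that fetch looks like the sketch below. The object key `data.json` is a hypothetical example; substitute the actual Object URL shown in the B2 console:

```javascript
// Standard virtual-hosted-style S3 Object URL; adjust region/key to yours.
function objectUrl(bucket, region, key) {
  return `https://${bucket}.s3.${region}.amazonaws.com/${key}`;
}

const dataUrl = objectUrl("www2.kingsharkworld.com", "ap-northeast-2", "data.json");

// In the browser, fetch() sends the page's Referer header automatically,
// so the request matches the Allow statement in B2's bucket policy.
// A direct visit to dataUrl carries no matching Referer and hits the Deny.
async function loadData() {
  const res = await fetch(dataUrl);
  if (!res.ok) throw new Error(`S3 denied the request: ${res.status}`);
  return res.json();
}
```

Call `loadData()` from the page's script and render the returned JSON as needed.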