50 commits
a51b2e5
[REF] Several refactor for splitting amazon storage
sebastienbeau Apr 10, 2018
94660dc
[REF] split sftp backend in a separated module
sebastienbeau Apr 11, 2018
e21470d
[REF] rename method store and retrieve by more explicit method add/ge…
sebastienbeau Apr 11, 2018
b231f44
[REF] refactor test in order to use the same test between the differe…
sebastienbeau Apr 11, 2018
a57baa8
[IMP] add method for listing directory and deleting file on storage.b…
sebastienbeau Apr 13, 2018
9504c17
[REF] set all module to the category storage
sebastienbeau Apr 17, 2018
e2e8643
[FIX] clean with pre-commit and pep 8
bguillot Apr 10, 2019
8dd82eb
[REF] refactor test for checking access right and refactor S3 testing
sebastienbeau Apr 11, 2019
68fcf91
[IMP] add tests and support pilbox for thumbnail
bguillot Apr 12, 2019
8557a47
[12.0] storage*: Make installable False
rousseldenis Jun 7, 2019
04dca8f
[FIX] __manifest__: Uses github repo url as website and add OCA into …
lmignon Sep 24, 2019
9d9b8f2
pre-commit, black, isort
sbidoul Oct 1, 2019
d43344f
[MIG] storage_backend_s3: Migration to 12.0
simahawk Oct 14, 2019
8986521
storage_backend_s3: improvements
simahawk Nov 2, 2019
6bcf264
[UPD] Update storage_backend_s3.pot
oca-travis Nov 4, 2019
8cd8956
[ADD] icon.png
OCA-git-bot Nov 4, 2019
914a831
S3: add file ACL control
simahawk Nov 22, 2019
2e6815c
Fix runbot warning on clashing labels
simahawk Nov 22, 2019
214d6b9
Add server_env support
simahawk Nov 22, 2019
672de0f
[UPD] Update storage_backend_s3.pot
oca-travis Nov 25, 2019
1212910
storage_backend_s3 12.0.2.0.0
OCA-git-bot Nov 25, 2019
564034a
[IMP] storage_backend_s3: black, isort, prettier
JasminSForgeFlow Jul 21, 2022
6b237ca
[MIG] storage_backend: Migration to 13.0
Oct 22, 2019
c2375b6
[NEW] Make s3 compatible with services that need region_name eg: scal…
mileo Nov 11, 2019
0309a2d
[NEW] Test fake s3
mileo Nov 12, 2019
d4658cf
storage_backend_s3: fix other region name handling
simahawk Jan 16, 2020
f63543d
S3 aws_other_region: support env override
simahawk Jan 16, 2020
822fdd0
[UPD] Update storage_backend_s3.pot
oca-travis Jan 16, 2020
acfaf5a
pre-commit update
OCA-git-bot Mar 14, 2020
1d51885
storage_backend: run permission tests explicitely
simahawk Oct 29, 2020
1f390c1
storage_backend_s3 bump 13.0.1.1.0
simahawk Nov 23, 2020
5c92a11
[ADD] add new V14 config
sebastienbeau Dec 6, 2020
bb23b10
[IMP] all: black, isort, prettier
sebastienbeau Dec 6, 2020
0438684
[MIG] batch migration of modules
sebastienbeau Dec 6, 2020
62c282a
storage_backend_s3 14.0.1.0.1
OCA-git-bot Mar 1, 2021
733b955
[UPD] Update storage_backend_s3.pot
oca-travis Jun 9, 2021
afe54ca
[CHG] storage: Use more permissive licence: AGPL-> LGPL
etobella Mar 10, 2021
052ff8d
storage_backend_s3 14.0.2.0.0
OCA-git-bot Aug 2, 2021
ec8410d
storage_s3: fix aws regions lookup to load once
simahawk Feb 3, 2021
f5d9afd
storage_backend_s3 14.0.2.0.1
OCA-git-bot Nov 30, 2021
6da2749
[UPD] Reflect boto3 version issue in readme
Mat-moran Dec 1, 2021
47791bc
[MIG] storage_backend_s3: Migration to 15.0
JasminSForgeFlow Jul 21, 2022
df427b4
[UPD] Update storage_backend_s3.pot
Oct 18, 2022
6bac369
[UPD] README.rst
OCA-git-bot Oct 18, 2022
2e7fdc6
[IMP] boto3 version bump
MiquelRForgeFlow Oct 19, 2022
73b4a17
[UPD] README.rst
OCA-git-bot Oct 20, 2022
3ebc67a
storage_backend_s3 15.0.1.0.1
OCA-git-bot Oct 20, 2022
f49461b
[UPD] README.rst
OCA-git-bot Sep 3, 2023
feff193
[MIG] storage_backend_s3: Migration to 16.0
IJOL Feb 3, 2026
9bae181
[FIX] storage_backend_s3: add vcrpy-unittest to test-requirements
IJOL Feb 3, 2026
1 change: 1 addition & 0 deletions requirements.txt
@@ -1,4 +1,5 @@
# generated from manifests external_dependencies
boto3>=1.35.0,<1.42.31
fsspec>=2024.5.0
fsspec>=2025.3.0
fsspec[s3]
1 change: 1 addition & 0 deletions setup/storage_backend_s3/odoo/addons/storage_backend_s3
6 changes: 6 additions & 0 deletions setup/storage_backend_s3/setup.py
@@ -0,0 +1,6 @@
import setuptools

setuptools.setup(
    setup_requires=['setuptools-odoo'],
    odoo_addon=True,
)
83 changes: 83 additions & 0 deletions storage_backend_s3/README.rst
@@ -0,0 +1,83 @@
==================
Storage Backend S3
==================

..
   !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
   !! This file is generated by oca-gen-addon-readme !!
   !! changes will be overwritten.                   !!
   !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
   !! source digest: sha256:76e4ebc7d37af8aa2744514100e89ce4934c00275d1271e22f1d0249bedff7fa
   !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

.. |badge1| image:: https://img.shields.io/badge/maturity-Beta-yellow.png
    :target: https://odoo-community.org/page/development-status
    :alt: Beta
.. |badge2| image:: https://img.shields.io/badge/licence-LGPL--3-blue.png
    :target: http://www.gnu.org/licenses/lgpl-3.0-standalone.html
    :alt: License: LGPL-3
.. |badge3| image:: https://img.shields.io/badge/github-OCA%2Fstorage-lightgray.png?logo=github
    :target: https://github.com/OCA/storage/tree/16.0/storage_backend_s3
    :alt: OCA/storage
.. |badge4| image:: https://img.shields.io/badge/weblate-Translate%20me-F47D42.png
    :target: https://translation.odoo-community.org/projects/storage-16-0/storage-16-0-storage_backend_s3
    :alt: Translate me on Weblate
.. |badge5| image:: https://img.shields.io/badge/runboat-Try%20me-875A7B.png
    :target: https://runboat.odoo-community.org/builds?repo=OCA/storage&target_branch=16.0
    :alt: Try me on Runboat

|badge1| |badge2| |badge3| |badge4| |badge5|

Add the possibility to store and retrieve data from Amazon S3 for your storage backend.

**Table of contents**

.. contents::
   :local:

Known issues / Roadmap
======================

There is a known issue with the latest versions of ``boto3`` and ``urllib3``:

- ``boto3`` needs to be pinned to ``boto3<=1.15.18``; see https://github.com/OCA/storage/issues/67
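As a hedged illustration only (the exact deployment workflow is not part of this module), the pin above could be expressed as a pip constraints file:

```shell
# Sketch (assumption): apply the workaround pin via a pip constraints file.
echo "boto3<=1.15.18" > constraints.txt
# pip install -c constraints.txt boto3   # install boto3 honoring the pin
cat constraints.txt
```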

Bug Tracker
===========

Bugs are tracked on `GitHub Issues <https://github.com/OCA/storage/issues>`_.
In case of trouble, please check there if your issue has already been reported.
If you spotted it first, help us to smash it by providing a detailed and welcomed
`feedback <https://github.com/OCA/storage/issues/new?body=module:%20storage_backend_s3%0Aversion:%2016.0%0A%0A**Steps%20to%20reproduce**%0A-%20...%0A%0A**Current%20behavior**%0A%0A**Expected%20behavior**>`_.

Do not contact contributors directly about support or help with technical issues.

Credits
=======

Authors
~~~~~~~

* Akretion

Contributors
~~~~~~~~~~~~

* Sebastien Beau <sebastien.beau@akretion.com>
* Raphaël Reverdy <raphael.reverdy@akretion.com>

Maintainers
~~~~~~~~~~~

This module is maintained by the OCA.

.. image:: https://odoo-community.org/logo.png
   :alt: Odoo Community Association
   :target: https://odoo-community.org

OCA, or the Odoo Community Association, is a nonprofit organization whose
mission is to support the collaborative development of Odoo features and
promote its widespread use.

This module is part of the `OCA/storage <https://github.com/OCA/storage/tree/16.0/storage_backend_s3>`_ project on GitHub.

You are welcome to contribute. To learn how please visit https://odoo-community.org/page/Contribute.
2 changes: 2 additions & 0 deletions storage_backend_s3/__init__.py
@@ -0,0 +1,2 @@
from . import models
from . import components
17 changes: 17 additions & 0 deletions storage_backend_s3/__manifest__.py
@@ -0,0 +1,17 @@
# Copyright 2017 Akretion (http://www.akretion.com).
# @author Sébastien BEAU <sebastien.beau@akretion.com>
# License LGPL-3.0 or later (http://www.gnu.org/licenses/lgpl).

{
    "name": "Storage Backend S3",
    "summary": "Implement Amazon S3 Storage",
    "version": "16.0.1.0.0",
    "category": "Storage",
    "website": "https://github.com/OCA/storage",
    "author": "Akretion, Odoo Community Association (OCA)",
    "license": "LGPL-3",
    "installable": True,
    "external_dependencies": {"python": ["boto3>=1.35.0,<1.42.31"]},
    "depends": ["storage_backend"],
    "data": ["views/backend_storage_view.xml"],
}
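The `external_dependencies` entry above restricts boto3 to `>=1.35.0,<1.42.31`. A minimal sketch of what such a version specifier means, using a simplified hand-rolled comparison (the `parse`/`in_range` helpers are hypothetical illustrations, not part of the module, and ignore pre-release/segment edge cases that real tools like `packaging` handle):

```python
def parse(version):
    # Split a dotted version string into a tuple of ints for lexicographic comparison.
    return tuple(int(part) for part in version.split("."))

def in_range(version, lower="1.35.0", upper="1.42.31"):
    # Lower bound inclusive, upper bound exclusive, mirroring ">=1.35.0,<1.42.31".
    return parse(lower) <= parse(version) < parse(upper)

print(in_range("1.41.0"))   # True: within the allowed range
print(in_range("1.15.18"))  # False: below the lower bound
```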
1 change: 1 addition & 0 deletions storage_backend_s3/components/__init__.py
@@ -0,0 +1 @@
from . import s3_adapter
121 changes: 121 additions & 0 deletions storage_backend_s3/components/s3_adapter.py
@@ -0,0 +1,121 @@
# Copyright 2017 Akretion (http://www.akretion.com).
# @author Sébastien BEAU <sebastien.beau@akretion.com>
# Copyright 2019 Camptocamp SA (http://www.camptocamp.com).
# @author Simone Orsi <simone.orsi@camptocamp.com>
# License LGPL-3.0 or later (http://www.gnu.org/licenses/lgpl).

import io
import logging

from odoo import _, exceptions

from odoo.addons.component.core import Component

_logger = logging.getLogger(__name__)

try:
    import boto3
    from botocore.exceptions import ClientError, EndpointConnectionError

except ImportError as err:  # pragma: no cover
    _logger.debug(err)

class S3StorageAdapter(Component):
    _name = "s3.adapter"
    _inherit = "base.storage.adapter"
    _usage = "amazon_s3"

    def _aws_bucket_params(self):
        params = {
            "aws_access_key_id": self.collection.aws_access_key_id,
            "aws_secret_access_key": self.collection.aws_secret_access_key,
        }
        if self.collection.aws_host:
            params["endpoint_url"] = self.collection.aws_host

        if self.collection.aws_region:
            if self.collection.aws_region != "other":
                params["region_name"] = self.collection.aws_region
            elif self.collection.aws_other_region:
                params["region_name"] = self.collection.aws_other_region
        return params

    def _get_bucket(self):
        params = self._aws_bucket_params()
        s3 = boto3.resource("s3", **params)
        bucket_name = self.collection.aws_bucket
        bucket = s3.Bucket(bucket_name)
        exists = True
        try:
            s3.meta.client.head_bucket(Bucket=bucket_name)
        except ClientError as e:
            # If a client error is thrown, then check that it was a 404 error.
            # If it was a 404 error, then the bucket does not exist.
            error_code = e.response["Error"]["Code"]
            if error_code == "404":
                exists = False
        except EndpointConnectionError as error:
            # log verbose error from s3, return short message for user
            _logger.exception("Error during connection on S3")
            raise exceptions.UserError(str(error)) from error
        region_name = params.get("region_name")
        if not exists:
            if not region_name:
                bucket = s3.create_bucket(Bucket=bucket_name)
            else:
                bucket = s3.create_bucket(
                    Bucket=bucket_name,
                    CreateBucketConfiguration={"LocationConstraint": region_name},
                )
        return bucket

    def _get_object(self, relative_path=None):
        bucket = self._get_bucket()
        path = None
        if relative_path:
            path = self._fullpath(relative_path)
        return bucket.Object(key=path)

    def add(self, relative_path, bin_data, mimetype=None, **kwargs):
        s3object = self._get_object(relative_path)
        file_params = self._aws_upload_fileobj_params(mimetype=mimetype, **kwargs)
        with io.BytesIO() as fileobj:
            fileobj.write(bin_data)
            fileobj.seek(0)
            try:
                s3object.upload_fileobj(fileobj, **file_params)
            except ClientError as error:
                # log verbose error from s3, return short message for user
                _logger.exception("Error during storage of the file %s", relative_path)
                raise exceptions.UserError(
                    _("The file could not be stored: %s") % str(error)
                ) from error

    def _aws_upload_fileobj_params(self, mimetype=None, **kw):
        extra_args = {}
        if mimetype:
            extra_args["ContentType"] = mimetype
        if self.collection.aws_cache_control:
            extra_args["CacheControl"] = self.collection.aws_cache_control
        if self.collection.aws_file_acl:
            extra_args["ACL"] = self.collection.aws_file_acl
        if extra_args:
            return {"ExtraArgs": extra_args}
        return {}

    def get(self, relative_path):
        s3object = self._get_object(relative_path)
        return s3object.get()["Body"].read()

    def list(self, relative_path):
        bucket = self._get_bucket()
        dir_path = self.collection.directory_path or ""
        return [
            o.key.replace(dir_path, "").lstrip("/")
            for o in bucket.objects.filter(Prefix=dir_path)
        ]

    def delete(self, relative_path):
        s3object = self._get_object(relative_path)
        s3object.delete()
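The `_aws_bucket_params` method above decides, from the backend record, which connection parameters to hand to `boto3.resource("s3", ...)`. A minimal sketch of that same logic, decoupled from Odoo so it runs standalone (the `Backend` namedtuple is a hypothetical stand-in for the `storage.backend` record, and the Exoscale endpoint URL is purely illustrative):

```python
from collections import namedtuple

# Hypothetical stand-in for the storage.backend record's S3 fields.
Backend = namedtuple(
    "Backend",
    ["aws_access_key_id", "aws_secret_access_key", "aws_host",
     "aws_region", "aws_other_region"],
)

def aws_bucket_params(backend):
    # Mirrors the adapter's _aws_bucket_params: credentials always,
    # endpoint_url only for non-AWS S3-compatible hosts, and region_name
    # either from the selection or from the free-text "other region" field.
    params = {
        "aws_access_key_id": backend.aws_access_key_id,
        "aws_secret_access_key": backend.aws_secret_access_key,
    }
    if backend.aws_host:
        params["endpoint_url"] = backend.aws_host
    if backend.aws_region:
        if backend.aws_region != "other":
            params["region_name"] = backend.aws_region
        elif backend.aws_other_region:
            params["region_name"] = backend.aws_other_region
    return params

# An S3-compatible provider needing a custom endpoint and a non-AWS region.
backend = Backend("KEY", "SECRET", "https://sos-ch-dk-2.exo.io", "other", "ch-dk-2")
params = aws_bucket_params(backend)
print(params["region_name"])  # ch-dk-2
```

The resulting dict would be splatted into `boto3.resource("s3", **params)` as the adapter does; with a plain AWS region the `endpoint_url` key is simply absent and boto3 falls back to the standard AWS endpoints.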
116 changes: 116 additions & 0 deletions storage_backend_s3/i18n/storage_backend_s3.pot
@@ -0,0 +1,116 @@
# Translation of Odoo Server.
# This file contains the translation of the following modules:
# * storage_backend_s3
#
msgid ""
msgstr ""
"Project-Id-Version: Odoo Server 15.0\n"
"Report-Msgid-Bugs-To: \n"
"Last-Translator: \n"
"Language-Team: \n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: \n"
"Plural-Forms: \n"

#. module: storage_backend_s3
#: model:ir.model.fields,field_description:storage_backend_s3.field_storage_backend__aws_host
msgid "AWS Host"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields,field_description:storage_backend_s3.field_storage_backend__aws_access_key_id
msgid "Access Key ID"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields.selection,name:storage_backend_s3.selection__storage_backend__backend_type__amazon_s3
#: model_terms:ir.ui.view,arch_db:storage_backend_s3.storage_backend_view_form
msgid "Amazon S3"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields,field_description:storage_backend_s3.field_storage_backend__aws_cache_control
msgid "Aws Cache Control"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields,field_description:storage_backend_s3.field_storage_backend__aws_file_acl
msgid "Aws File Acl"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields,field_description:storage_backend_s3.field_storage_backend__backend_type
msgid "Backend Type"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields,field_description:storage_backend_s3.field_storage_backend__aws_bucket
msgid "Bucket"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields,help:storage_backend_s3.field_storage_backend__aws_host
msgid "If you are using a different host than standard AWS ones, eg: Exoscale"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields,field_description:storage_backend_s3.field_storage_backend__aws_other_region
msgid "Other region"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields,field_description:storage_backend_s3.field_storage_backend__aws_region
msgid "Region"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields,field_description:storage_backend_s3.field_storage_backend__aws_secret_access_key
msgid "Secret Access Key"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model,name:storage_backend_s3.model_storage_backend
msgid "Storage Backend"
msgstr ""

#. module: storage_backend_s3
#: code:addons/storage_backend_s3/components/s3_adapter.py:0
#, python-format
msgid "The file could not be stored: %s"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields.selection,name:storage_backend_s3.selection__storage_backend__aws_file_acl__authenticated-read
msgid "authenticated-read"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields.selection,name:storage_backend_s3.selection__storage_backend__aws_file_acl__aws-exec-read
msgid "aws-exec-read"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields.selection,name:storage_backend_s3.selection__storage_backend__aws_file_acl__bucket-owner-full-control
msgid "bucket-owner-full-control"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields.selection,name:storage_backend_s3.selection__storage_backend__aws_file_acl__bucket-owner-read
msgid "bucket-owner-read"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields.selection,name:storage_backend_s3.selection__storage_backend__aws_file_acl__private
msgid "private"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields.selection,name:storage_backend_s3.selection__storage_backend__aws_file_acl__public-read
msgid "public-read"
msgstr ""

#. module: storage_backend_s3
#: model:ir.model.fields.selection,name:storage_backend_s3.selection__storage_backend__aws_file_acl__public-read-write
msgid "public-read-write"
msgstr ""
1 change: 1 addition & 0 deletions storage_backend_s3/models/__init__.py
@@ -0,0 +1 @@
from . import storage_backend