Compare commits: 727b03dcf8...main (14 Commits)

Commit SHA1s: d52f5ca2b9, 3d0f2006b9, 2dad2f3087, 7c7f7a0218, f63c204b12, e465497714, 894fe53725, 2737535866, a28bc043f5, 0d05a2e55e, f6528f8f06, 01c9d96c47, f582c360b4, 2f7469be9c
BIN .demo2-instance-with-init-script.py.swp (new file; binary file not shown)
README.md (new file, 98 lines)
@@ -0,0 +1,98 @@
# Cloud Computing Project Submission

Group Number 2

Members:

- Emin Arslan 1581975, fd0003933
- Yash Sharma 1573145, fd0003398
- Sagarkumar Dipakbhai Rokad 1566440, fd0003372
- Abhimanyu Rajesh Kanase 1569090, fd0003385
## First Task

First, we familiarized ourselves with the architecture of the faafo application and its commands. Then we changed the repository link in the faafo install script to point to Emin's [custom git repository](https://git.emin.software/haxala1r/cloud-computing-msc-ai-examples). This was necessary because the demo deployment scripts pull from the remote repository when installing faafo on an instance, so local changes alone would be ignored.

After that, we added a few custom commands to the faafo command-line tool. First, we added a delete-all command, which deletes all fractals. Second, we added a delete-selected command, which takes a comma-separated list of fractal UUIDs and deletes each of them. Adding these commands gave us a better understanding of the faafo command-line tool, so we can add more commands as needed in the future. We also added help messages for the new commands to provide a better user experience.
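The core of the new delete-selected command is simply splitting the comma-separated UUID argument and issuing one DELETE request per fractal against the API, the same call the existing delete command makes. Below is a minimal, self-contained sketch of that logic; the endpoint URL and the UUIDs are placeholders, not values from our deployment.

```
# Minimal sketch of the delete-selected logic (the full change is in faafo/bin/faafo).
# ENDPOINT_URL and the UUIDs passed at the bottom are illustrative placeholders.
import requests

ENDPOINT_URL = 'http://localhost'


def delete_selected(uuid_csv):
    """Delete every fractal whose UUID appears in the comma-separated list."""
    headers = {'Content-Type': 'application/vnd.api+json',
               'Accept': 'application/vnd.api+json'}
    for uuid in uuid_csv.split(","):
        result = requests.delete("%s/v1/fractal/%s" % (ENDPOINT_URL, uuid),
                                 headers=headers)
        print(uuid, result.status_code)


delete_selected("<uuid-1>,<uuid-2>")  # example invocation with placeholder UUIDs
```

On the command line the equivalent calls are `faafo delete-all` and `faafo delete-selected <uuid-1>,<uuid-2>`.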
List of changed files for task 1:

- faafo/contrib/install.sh (changed repository link)
- demo2-instance-with-init-script.py (changed repository link)
- demo3-microservice.py (changed repository link)
- demo4-1-scale-out.py (changed repository link)
- demo4-2-scale-out-add-worker.py (changed repository link)
- faafo/bin/faafo (added commands)
## Second Task

The faafo application as given stores all image data, including the image file itself, in an SQL database. For the second task we changed the faafo API and worker programs (faafo/faafo/api/service.py and faafo/faafo/worker/service.py) to store the image file in OpenStack Object Storage (Swift). Other data about the image is still stored in the database.
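Concretely, the Fractal model keeps its metadata columns, the old image blob column is removed, and the url column now holds the name of the Swift object. A trimmed-down sketch of the resulting model is shown below (assuming Flask-SQLAlchemy; columns not relevant to this change are omitted).

```
# Trimmed sketch of the adjusted Fractal model; only the columns relevant
# to the object-storage change are shown here.
from flask_sqlalchemy import SQLAlchemy

db = SQLAlchemy()


class Fractal(db.Model):
    uuid = db.Column(db.String(36), primary_key=True)
    checksum = db.Column(db.String(256), unique=True)
    # Name of the Swift object that holds the PNG, instead of the raw image blob
    url = db.Column(db.String(256), nullable=True)
    duration = db.Column(db.Float)
    size = db.Column(db.Integer, nullable=True)
    generated_by = db.Column(db.String(256), nullable=True)
```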
We changed the API server so that it no longer stores the image as a blob in the database. Instead, only a reference to the Swift object containing the image data is kept in the url column, and the API retrieves the image from object storage when it is requested.

Upon first running the API, a new container named "fractals" is created under our account. This container holds all generated fractal image files.
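The libcloud calls involved are condensed in the sketch below. It mirrors the setup in faafo/faafo/api/service.py, but the username, password, project and auth URL are placeholders for our lab values, and it catches libcloud's ContainerDoesNotExistError where the actual change uses a bare except.

```
# Condensed sketch of the Swift integration; credentials and host are placeholders.
import io

import libcloud.security
from libcloud.storage.providers import get_driver
from libcloud.storage.types import ContainerDoesNotExistError, Provider

libcloud.security.VERIFY_SSL_CERT = False  # the lab endpoint uses a self-signed certificate

Swift = get_driver(Provider.OPENSTACK_SWIFT)
driver = Swift('USERNAME', 'PASSWORD',
               ex_force_auth_url='https://<keystone-host>:5000/',
               ex_force_auth_version='3.x_password',
               ex_tenant_name='PROJECT',
               ex_domain_name='default')

# Create the "fractals" container on first run, reuse it afterwards.
try:
    container = driver.get_container(container_name='fractals')
except ContainerDoesNotExistError:
    container = driver.create_container(container_name='fractals')


def upload_image(image_bytes, object_name):
    """Store one rendered fractal as a Swift object and return its name."""
    driver.upload_object_via_stream(iterator=io.BytesIO(image_bytes),
                                    container=container,
                                    object_name=object_name)
    return object_name


def download_image(object_name):
    """Fetch a fractal image back from Swift as raw bytes."""
    obj = driver.get_object(container_name='fractals', object_name=object_name)
    return b''.join(driver.download_object_as_stream(obj))
```

The worker itself never talks to Swift: it hands the base64-encoded image to the API, which uploads it and stores only the object name in the database.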
OpenStack authentication is currently performed with a pre-generated application credential. In our first attempts we used password authentication directly. In a real deployment, it would be better to have the deployment script automatically generate a fresh set of application credentials for the API and workers to use.
OpenStack authentication through libcloud was a difficult roadblock. We were stuck on this issue for a long time because the example given in the documentation did not work for us. We were eventually able to fix it by forcing v3 authentication explicitly and using the base auth URL (without the /v3 suffix) instead of the one given by OpenStack. Here is the code example that worked for us:
```
from libcloud.storage.types import Provider
from libcloud.storage.providers import get_driver
import libcloud.security
libcloud.security.VERIFY_SSL_CERT = False

cls = get_driver(Provider.OPENSTACK_SWIFT)
# Use these parameters for v3 authentication
driver = cls(
    'CloudComp2',  # username
    'demo',  # password
    ex_force_auth_url='https://10.32.4.29:5000/',  # NOT https://10.32.4.29:5000/v3
    ex_force_auth_version='3.x_password',  # '3.x_appcred' for application credentials
    ex_tenant_name='CloudComp2',
    ex_domain_name='default'
)

print(driver.list_containers())
```
This code sample authenticates with a username and password directly. Our submitted faafo application instead authenticates with application credentials; for that, ex_force_auth_version has to be changed to '3.x_appcred', as sketched below.
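For completeness, here is a sketch of the application-credential variant. The credential ID and secret are placeholders; as far as we understand libcloud, they go into the username and password positions, and the project/domain arguments can be dropped because the project is already bound to the credential.

```
# Sketch of Swift authentication with an OpenStack application credential.
# The credential ID and secret below are placeholders, not real values.
import libcloud.security
from libcloud.storage.providers import get_driver
from libcloud.storage.types import Provider

libcloud.security.VERIFY_SSL_CERT = False

cls = get_driver(Provider.OPENSTACK_SWIFT)
driver = cls(
    '<application-credential-id>',
    '<application-credential-secret>',
    ex_force_auth_url='https://10.32.4.29:5000/',
    ex_force_auth_version='3.x_appcred'
)

print(driver.list_containers())
```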
We tested deployment with the demo2, demo3, demo4-1 and demo4-2 deployment scripts. All of these deployments were successful and used OpenStack Object Storage correctly, showing that the application in its current state is scalable.

List of changed files for task 2:

- faafo/faafo/api/service.py
- faafo/faafo/worker/service.py

A more detailed breakdown of the exact changes made to each file can be found in [our git repository history](https://git.emin.software/haxala1r/cloud-computing-msc-ai-examples).
@@ -30,7 +30,7 @@ libcloud.security.CA_CERTS_PATH = ['./root-ca.crt']
 # Please use 1-29 for 0 in the following variable to specify your group number.
 # (will be used for the username, project etc., as coordinated in the lab sessions)
 
-GROUP_NUMBER = 0
+GROUP_NUMBER = 2
 
 
 
@@ -177,7 +177,7 @@ def main(): # noqa: C901 pylint: disable=too-many-branches,too-many-statements,
     ###########################################################################
 
     # new repo on git-ce.rwth-aachen.de:
-    hsfd_faafo_cloud_init_script = 'https://git-ce.rwth-aachen.de/sebastian.rieger/cloud-computing-msc-ai-examples/-/raw/master/faafo/contrib/install.sh'
+    hsfd_faafo_cloud_init_script = 'https://git.emin.software/haxala1r/cloud-computing-msc-ai-examples/raw/branch/main/faafo/contrib/install.sh'
 
     userdata = '#!/usr/bin/env bash\n' \
                f'curl -L -s {hsfd_faafo_cloud_init_script} | bash -s -- ' \
@@ -193,7 +193,7 @@ def main(): # noqa: C901 pylint: disable=too-many-branches,too-many-statements,
     ###########################################################################
 
     # new repo on git-ce.rwth-aachen.de:
-    hsfd_faafo_cloud_init_script = 'https://git-ce.rwth-aachen.de/sebastian.rieger/cloud-computing-msc-ai-examples/-/raw/master/faafo/contrib/install.sh'
+    hsfd_faafo_cloud_init_script = 'https://git.emin.software/haxala1r/cloud-computing-msc-ai-examples/raw/branch/main/faafo/contrib/install.sh'
 
     userdata = '#!/usr/bin/env bash\n' \
               f'curl -L -s {hsfd_faafo_cloud_init_script} | bash -s -- ' \
@@ -251,7 +251,7 @@ def main(): # noqa: C901 pylint: disable=too-many-branches,too-many-statements,
     ###########################################################################
 
     # new repo on git-ce.rwth-aachen.de:
-    hsfd_faafo_cloud_init_script = 'https://git-ce.rwth-aachen.de/sebastian.rieger/cloud-computing-msc-ai-examples/-/raw/master/faafo/contrib/install.sh'
+    hsfd_faafo_cloud_init_script = 'https://git.emin.software/haxala1r/cloud-computing-msc-ai-examples/raw/branch/main/faafo/contrib/install.sh'
 
     userdata_service = '#!/usr/bin/env bash\n' \
                        f'curl -L -s {hsfd_faafo_cloud_init_script} | bash -s -- ' \
@@ -190,7 +190,7 @@ def main(): # noqa: C901 pylint: disable=too-many-branches,too-many-statements,
     ###########################################################################
 
     # new repo on git-ce.rwth-aachen.de:
-    hsfd_faafo_cloud_init_script = 'https://git-ce.rwth-aachen.de/sebastian.rieger/cloud-computing-msc-ai-examples/-/raw/master/faafo/contrib/install.sh'
+    hsfd_faafo_cloud_init_script = 'https://git.emin.software/haxala1r/cloud-computing-msc-ai-examples/raw/branch/main/faafo/contrib/install.sh'
 
     userdata_worker = '#!/usr/bin/env bash\n' \
                       f'curl -L -s {hsfd_faafo_cloud_init_script} | bash -s -- ' \
@@ -172,6 +172,25 @@ def do_delete_fractal():
                              headers=headers)
     LOG.debug("result: %s" % result.text)
 
+def do_delete_all():
+    LOG.info("deleting everything")
+    fractals = get_fractals()
+    for f in fractals:
+        fractal_data = f["attributes"]
+        uuid = fractal_data["uuid"]
+
+        CONF.command.uuid = uuid
+        do_delete_fractal()
+def do_delete_slected_fractal():
+    LOG.info("deleting given fractals %s" % CONF.command.uuid)
+    raw_list= CONF.command.uuid.split(",")
+    for uuid in raw_list:
+        headers = {'Content-Type': 'application/vnd.api+json',
+                   'Accept': 'application/vnd.api+json'}
+        result = requests.delete("%s/v1/fractal/%s" %
+                                 (CONF.endpoint_url, uuid),
+                                 headers=headers)
+        LOG.debug("result: %s" % result.text)
 
 def do_create_fractal():
     random.seed()
@@ -246,6 +265,15 @@ def add_command_parsers(subparsers):
     parser = subparsers.add_parser('delete')
     parser.set_defaults(func=do_delete_fractal)
     parser.add_argument("uuid", help="Fractal to delete.")
 
+    ## no arguments
+    parser = subparsers.add_parser('delete-all')
+    parser.set_defaults(func=do_delete_all)
+
+    ## it takes a list of uuids separated by commas
+    parser = subparsers.add_parser('delete-selected')
+    parser.set_defaults(func=do_delete_slected_fractal)
+    parser.add_argument("uuid", help="deleting the selected fractals. enter the uuid separated by commas.")
+
     parser = subparsers.add_parser('show')
     parser.set_defaults(func=do_show_fractal)
@@ -151,8 +151,8 @@ if [[ -e /etc/os-release ]]; then
     exit 1
 fi
 
-# HSFD changed to git-ce.rwth-aachen.de repo
-git clone https://git-ce.rwth-aachen.de/sebastian.rieger/cloud-computing-msc-ai-examples.git
+# use emin's gitea instance
+git clone https://git.emin.software/haxala1r/cloud-computing-msc-ai-examples.git
 cd cloud-computing-msc-ai-examples/faafo
 # following line required by bug 1636150
 sudo pip3 install --upgrade pbr
@@ -151,8 +151,8 @@ if [[ -e /etc/os-release ]]; then
     exit 1
 fi
 
-# HSFD changed to git-ce.rwth-aachen.de repo
-git clone https://git-ce.rwth-aachen.de/sebastian.rieger/cloud-computing-msc-ai-examples.git
+# URL changed to use emin's gitea instance
+git clone https://git.emin.software/haxala1r/cloud-computing-msc-ai-examples.git
 cd cloud-computing-msc-ai-examples/faafo
 # following line required by bug 1636150
 sudo pip install --upgrade pbr
@@ -12,11 +12,14 @@
 
 import base64
 import copy
+import hashlib
 import io
 import socket
+import uuid
 from pkg_resources import resource_filename
 
 import flask
+from flask import request
 from flask_restless import APIManager
 from flask_sqlalchemy import SQLAlchemy
 from flask_bootstrap import Bootstrap
@@ -25,11 +28,18 @@ from kombu.pools import producers
 from oslo_config import cfg
 from oslo_log import log
 from PIL import Image
-from sqlalchemy.dialects import mysql
 
 from faafo import queues
 from faafo import version
 
+from libcloud.storage.types import Provider
+from libcloud.storage.providers import get_driver
+import libcloud.security
+
+# Disable SSL verification.
+# It would be better to add the certificate later.
+libcloud.security.VERIFY_SSL_CERT = False
+
 LOG = log.getLogger('faafo.api')
 CONF = cfg.CONF
 
@@ -57,6 +67,24 @@ CONF(project='api', prog='faafo-api',
 log.setup(CONF, 'api',
           version=version.version_info.version_string())
 
+# Initialize Swift driver
+Swift = get_driver(Provider.OPENSTACK_SWIFT)
+driver = Swift(
+    'CloudComp2',
+    'demo',
+    ex_force_auth_url='https://10.32.4.29:5000/',
+    ex_force_auth_version='3.x_password',
+    ex_tenant_name='CloudComp2',
+    ex_domain_name='default',
+)
+
+# Ensure container exists
+try:
+    container = driver.get_container(container_name='fractals')
+except:
+    # Create container if it doesn't exist
+    container = driver.create_container(container_name='fractals')
+
 template_path = resource_filename(__name__, "templates")
 app = flask.Flask('faafo.api', template_folder=template_path)
 app.config['DEBUG'] = CONF.debug
@@ -73,10 +101,10 @@ def list_opts():
     return [(None, copy.deepcopy(api_opts))]
 
 
 class Fractal(db.Model):
     uuid = db.Column(db.String(36), primary_key=True)
     checksum = db.Column(db.String(256), unique=True)
-    url = db.Column(db.String(256), nullable=True)
+    url = db.Column(db.String(256), nullable=True)  # Stores Swift object name/path
     duration = db.Column(db.Float)
     size = db.Column(db.Integer, nullable=True)
     width = db.Column(db.Integer, nullable=False)
@@ -86,13 +114,6 @@ class Fractal(db.Model):
     xb = db.Column(db.Float, nullable=False)
     ya = db.Column(db.Float, nullable=False)
     yb = db.Column(db.Float, nullable=False)
-
-    if CONF.database_url.startswith('mysql'):
-        LOG.debug('Using MySQL database backend')
-        image = db.Column(mysql.MEDIUMBLOB, nullable=True)
-    else:
-        image = db.Column(db.LargeBinary, nullable=True)
-
     generated_by = db.Column(db.String(256), nullable=True)
 
     def __repr__(self):
@@ -106,6 +127,36 @@ manager = APIManager(app=app, session=db.session)
 connection = Connection(CONF.transport_url)
 
 
+def upload_image_to_swift(image_bytes, object_name):
+    """Upload image bytes to Swift storage and return the object name."""
+    try:
+        LOG.debug(f"Uploading image to Swift: {object_name}")
+        obj = driver.upload_object_via_stream(
+            iterator=io.BytesIO(image_bytes),
+            container=container,
+            object_name=object_name
+        )
+        LOG.debug(f"Successfully uploaded {object_name} to Swift")
+        return object_name
+    except Exception as e:
+        LOG.error(f"Failed to upload image to Swift: {e}")
+        raise
+
+
+def download_image_from_swift(object_name):
+    """Download image from Swift storage."""
+    try:
+        LOG.debug(f"Downloading image from Swift: {object_name}")
+        obj = driver.get_object(container_name='fractals', object_name=object_name)
+        stream = driver.download_object_as_stream(obj)
+        image_data = b''.join(stream)
+        LOG.debug(f"Successfully downloaded {object_name} from Swift")
+        return image_data
+    except Exception as e:
+        LOG.error(f"Failed to download image from Swift: {e}")
+        raise
+
+
 @app.route('/', methods=['GET'])
 @app.route('/index', methods=['GET'])
 @app.route('/index/<int:page>', methods=['GET'])
@@ -120,20 +171,17 @@ def index(page=1):
 @app.route('/fractal/<string:fractalid>', methods=['GET'])
 def get_fractal(fractalid):
     fractal = Fractal.query.filter_by(uuid=fractalid).first()
-    if not fractal:
-        response = flask.jsonify({'code': 404,
-                                  'message': 'Fracal not found'})
-        response.status_code = 404
-    else:
-        image_data = base64.b64decode(fractal.image)
-        image = Image.open(io.BytesIO(image_data))
-        output = io.BytesIO()
-        image.save(output, "PNG")
-        image.seek(0)
-        response = flask.make_response(output.getvalue())
-        response.content_type = "image/png"
+    if not fractal or not fractal.url:
+        return flask.jsonify({'code': 404, 'message': 'Fractal not found'}), 404
 
-    return response
+    try:
+        image_data = download_image_from_swift(fractal.url)
+        response = flask.make_response(image_data)
+        response.content_type = "image/png"
+        return response
+    except Exception as e:
+        LOG.error(f"Error retrieving fractal {fractalid}: {e}")
+        return flask.jsonify({'code': 500, 'message': 'Error retrieving fractal'}), 500
 
 
 def generate_fractal(**kwargs):
@@ -147,15 +195,41 @@ def generate_fractal(**kwargs):
 
 
 def convert_image_to_binary(**kwargs):
+    """Process the image data from worker and upload to Swift."""
     LOG.debug("Preprocessor call: " + str(kwargs))
 
     if 'image' in kwargs['data']['data']['attributes']:
-        LOG.debug("Converting image to binary...")
-        kwargs['data']['data']['attributes']['image'] = \
-            str(kwargs['data']['data']['attributes']['image']).encode("ascii")
+        LOG.debug("Processing image for Swift upload...")
+        # Get the base64 encoded image from worker
+        image_base64 = kwargs['data']['data']['attributes']['image']
+        image_bytes = base64.b64decode(image_base64)
+
+        # Generate object name using UUID
+        fractal_uuid = kwargs['data']['data']['attributes']['uuid']
+        object_name = f"{fractal_uuid}.png"
+
+        try:
+            # Upload to Swift
+            swift_object_name = upload_image_to_swift(image_bytes, object_name)
+
+            # Update the fractal record with Swift object name instead of binary data
+            kwargs['data']['data']['attributes']['url'] = swift_object_name
+
+            # Remove the binary image data since we're storing in Swift
+            del kwargs['data']['data']['attributes']['image']
+
+            LOG.debug(f"Image uploaded to Swift as {swift_object_name}")
+
+        except Exception as e:
+            LOG.error(f"Failed to upload image to Swift: {e}")
+            # Keep the binary data as fallback if Swift upload fails
+            kwargs['data']['data']['attributes']['image'] = \
+                str(kwargs['data']['data']['attributes']['image']).encode("ascii")
 
 
 def main():
-    print("Starting API server - new...")
+    print("Starting API server with Swift storage...")
     with app.app_context():
         manager.create_api(Fractal, methods=['GET', 'POST', 'DELETE', 'PATCH'],
                            postprocessors={'POST_RESOURCE': [generate_fractal]},
@@ -37,11 +37,11 @@ LOG = log.getLogger('faafo.worker')
 CONF = cfg.CONF
 
 
-worker_opts = {
+worker_opts = [
     cfg.StrOpt('endpoint-url',
                default='http://localhost',
                help='API connection URL')
-}
+]
 
 CONF.register_opts(worker_opts)
 
@@ -84,6 +84,13 @@ class JuliaSet(object):
         self.image.save(fp, "PNG")
         return fp.name
 
+    def get_image_bytes(self):
+        """Return image as bytes without saving to file."""
+        with tempfile.NamedTemporaryFile() as fp:
+            self.image.save(fp, "PNG")
+            fp.seek(0)
+            return fp.read()
+
     def _set_point(self):
         random.seed()
         while True:
@@ -116,6 +123,8 @@ class Worker(ConsumerMixin):
         LOG.info("processing task %s" % task['uuid'])
         LOG.debug(task)
         start_time = time.time()
+
+        # Generate the fractal
         juliaset = JuliaSet(task['width'],
                             task['height'],
                             task['xa'],
@@ -127,16 +136,20 @@
         LOG.info("task %s processed in %f seconds" %
                  (task['uuid'], elapsed_time))
 
-        filename = juliaset.get_file()
-        LOG.debug("saved result of task %s to temporary file %s" %
-                  (task['uuid'], filename))
-        with open(filename, "rb") as fp:
-            size = os.fstat(fp.fileno()).st_size
-            image = base64.b64encode(fp.read())
-        checksum = hashlib.sha256(open(filename, 'rb').read()).hexdigest()
-        os.remove(filename)
-        LOG.debug("removed temporary file %s" % filename)
+        # Get image as bytes instead of saving to file
+        image_bytes = juliaset.get_image_bytes()
+        size = len(image_bytes)
+
+        # Calculate checksum
+        checksum = hashlib.sha256(image_bytes).hexdigest()
+
+        # Convert to base64 for JSON transmission
+        image_base64 = base64.b64encode(image_bytes).decode("ascii")
+
+        LOG.debug("generated fractal %s, size: %d bytes, checksum: %s" %
+                  (task['uuid'], size, checksum))
+
+        # Prepare result for API
         result = {
             'data': {
                 'type': 'fractal',
@@ -144,7 +157,7 @@
                 'attributes': {
                     'uuid': task['uuid'],
                     'duration': elapsed_time,
-                    'image': image.decode("ascii"),
+                    'image': image_base64,  # This will be processed by API and uploaded to Swift
                     'checksum': checksum,
                     'size': size,
                     'generated_by': socket.gethostname()
@@ -155,12 +168,22 @@
         headers = {'Content-Type': 'application/vnd.api+json',
                    'Accept': 'application/vnd.api+json'}
 
-        resp = requests.patch("%s/v1/fractal/%s" %
-                              (CONF.endpoint_url, str(task['uuid'])),
-                              json.dumps(result),
-                              headers=headers,
-                              timeout=30)
-        LOG.debug("Result: %s" % resp.text)
+        try:
+            resp = requests.patch("%s/v1/fractal/%s" %
+                                  (CONF.endpoint_url, str(task['uuid'])),
+                                  json.dumps(result),
+                                  headers=headers,
+                                  timeout=30)
+            LOG.debug("API Response: %s" % resp.text)
+
+            if resp.status_code not in [200, 201]:
+                LOG.error("API request failed with status %d: %s" %
+                          (resp.status_code, resp.text))
+            else:
+                LOG.info("Successfully uploaded fractal %s to Swift storage" % task['uuid'])
+
+        except Exception as e:
+            LOG.error("Failed to send result to API: %s" % e)
 
         message.ack()
         return result
@@ -28,4 +28,5 @@ flask-sqlalchemy
 oslo.config
 oslo.log
 PrettyTable
 kombu
+apache-libcloud