This repository has been archived by the owner on Dec 21, 2023. It is now read-only.

Solr data should be optional #40

Open
jameswilson opened this issue Dec 17, 2018 · 2 comments


jameswilson commented Dec 17, 2018

Currently it is impossible to initialize the Solr container unless `.spark/solr-data.tar.gz` is present in the repository.

We need to be able to initialize without importing default content.
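(As a hypothetical stopgap until this is fixed, a project could commit an empty tarball so the unconditional `tar xfz` step has something to extract — untested sketch, GNU tar syntax:)

```sh
# Create an empty .spark/solr-data.tar.gz so solr:init's tar step succeeds.
mkdir -p .spark
tar czf .spark/solr-data.tar.gz --files-from /dev/null
```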

```
$ composer spark containers:start solr
> SPARK_WORKDIR=`pwd` robo --ansi --load-from vendor/bluesparklabs/spark 'containers:start' 'solr'

 Spark ✨ IULD8 — Starting containers…                                           

 [Droath\RoboDockerCompose\Task\Up] Running Docker-Compose: /usr/local/bin/docker-compose  --file ./docker/docker-compose.drupal8.yml --project-name IULD8 up  -d solr
 [Droath\RoboDockerCompose\Task\Up] Running /usr/local/bin/docker-compose  --file ./docker/docker-compose.drupal8.yml --project-name IULD8 up  -d solr
Creating network "iuld8_default" with the default driver
Creating iuld8_solr_1_9321d9ca79d7 ... done
 [Droath\RoboDockerCompose\Task\Up] Done in 1.587s


$ composer spark solr:init
> SPARK_WORKDIR=`pwd` robo --ansi --load-from vendor/bluesparklabs/spark 'solr:init'

 Spark ✨ IULD8 — Creating Solr core…                                            

 [Exec] Running composer run -d /path/to/some/project spark containers:exec 'solr' '/bin/sh -c "make init -f /usr/local/bin/actions.mk; tar xfz /opt/spark-project/.spark/solr-data.tar.gz -C /opt/solr/server/solr/default; make reload core=default -f /usr/local/bin/actions.mk"'
> SPARK_WORKDIR=`pwd` robo --ansi --load-from vendor/bluesparklabs/spark 'containers:exec' 'solr' '/bin/sh -c "make init -f /usr/local/bin/actions.mk; tar xfz /opt/spark-project/.spark/solr-data.tar.gz -C /opt/solr/server/solr/default; make reload core=default -f /usr/local/bin/actions.mk"'

 Spark ✨ IULD8 — Executing in container: solr                                   

 [Droath\RoboDockerCompose\Task\Execute] Running Docker-Compose: /usr/local/bin/docker-compose  --file ./docker/docker-compose.drupal8.yml --project-name IULD8 exec  -T  solr /bin/sh -c "make init -f /usr/local/bin/actions.mk; tar xfz /opt/spark-project/.spark/solr-data.tar.gz -C /opt/solr/server/solr/default; make reload core=default -f /usr/local/bin/actions.mk"
 [Droath\RoboDockerCompose\Task\Execute] Running /usr/local/bin/docker-compose  --file ./docker/docker-compose.drupal8.yml --project-name IULD8 exec  -T  solr /bin/sh -c "make init -f /usr/local/bin/actions.mk; tar xfz /opt/spark-project/.spark/solr-data.tar.gz -C /opt/solr/server/solr/default; make reload core=default -f /usr/local/bin/actions.mk"
init_solr localhost
No solr cores found, creating a default core
make[1]: Entering directory '/opt/solr/server/solr'
echo "Creating core default from config set drupal"
Creating core default from config set drupal
curl -sIN "http://localhost:8983/solr/admin/cores?action=CREATE&name=default&configSet=drupal&instanceDir=default" \
	| head -n 1 | awk '{print $2}' | grep -q 200
make[1]: *** [/usr/local/bin/actions.mk:32: create] Error 1
make[1]: Leaving directory '/opt/solr/server/solr'
make: *** [/usr/local/bin/actions.mk:25: init] Error 2
tar (child): /opt/spark-project/.spark/solr-data.tar.gz: Cannot open: No such file or directory
tar (child): Error is not recoverable: exiting now
tar: Child returned status 2
tar: Error is not recoverable: exiting now
echo "Reloading core default"
Reloading core default
curl -sIN "http://localhost:8983/solr/admin/cores?action=RELOAD&core=default" \
	| head -n 1 | awk '{print $2}' | grep -q 200
make: *** [/usr/local/bin/actions.mk:44: reload] Error 1
 [Droath\RoboDockerCompose\Task\Execute]  init_solr localhost
No solr cores found, creating a default core
make[1]: Entering directory '/opt/solr/server/solr'
echo "Creating core default from config set drupal"
Creating core default from config set drupal
curl -sIN "http://localhost:8983/solr/admin/cores?action=CREATE&name=default&configSet=drupal&instanceDir=default" \
	| head -n 1 | awk '{print $2}' | grep -q 200
make[1]: Leaving directory '/opt/solr/server/solr'
echo "Reloading core default"
Reloading core default
curl -sIN "http://localhost:8983/solr/admin/cores?action=RELOAD&core=default" \
	| head -n 1 | awk '{print $2}' | grep -q 200
  Time 0.972s
 [Droath\RoboDockerCompose\Task\Execute]  Exit code 2  Time 0.972s
 [notice] Stopping on fail. Exiting....
 [error]  Exit Code: 2 
 [error]    in task Droath\RoboDockerCompose\Task\Execute 

  init_solr localhost
No solr cores found, creating a default core
make[1]: Entering directory '/opt/solr/server/solr'
echo "Creating core default from config set drupal"
Creating core default from config set drupal
curl -sIN "http://localhost:8983/solr/admin/cores?action=CREATE&name=default&configSet=drupal&instanceDir=default" \
	| head -n 1 | awk '{print $2}' | grep -q 200
make[1]: Leaving directory '/opt/solr/server/solr'
echo "Reloading core default"
Reloading core default
curl -sIN "http://localhost:8983/solr/admin/cores?action=RELOAD&core=default" \
	| head -n 1 | awk '{print $2}' | grep -q 200
 
Script SPARK_WORKDIR=`pwd` robo --ansi --load-from vendor/bluesparklabs/spark handling the spark event returned with error code 2
 [Exec]  Exit code 2  Time 1.701s
 [notice] Stopping on fail. Exiting....
 [error]  Exit Code: 2 
 [error]    in task Robo\Task\Base\Exec 

  
 Spark ✨ IULD8 — Executing in container: solr                                   

init_solr localhost
No solr cores found, creating a default core
make[1]: Entering directory '/opt/solr/server/solr'
echo "Creating core default from config set drupal"
Creating core default from config set drupal
curl -sIN "http://localhost:8983/solr/admin/cores?action=CREATE&name=default&configSet=drupal&instanceDir=default" \
	| head -n 1 | awk '{print $2}' | grep -q 200
make[1]: *** [/usr/local/bin/actions.mk:32: create] Error 1
make[1]: Leaving directory '/opt/solr/server/solr'
make: *** [/usr/local/bin/actions.mk:25: init] Error 2
tar (child): /opt/spark-project/.spark/solr-data.tar.gz: Cannot open: No such file or directory
tar (child): Error is not recoverable: exiting now
tar: Child returned status 2
tar: Error is not recoverable: exiting now
echo "Reloading core default"
Reloading core default
curl -sIN "http://localhost:8983/solr/admin/cores?action=RELOAD&core=default" \
	| head -n 1 | awk '{print $2}' | grep -q 200
make: *** [/usr/local/bin/actions.mk:44: reload] Error 1
 
Script SPARK_WORKDIR=`pwd` robo --ansi --load-from vendor/bluesparklabs/spark handling the spark event returned with error code 2
```
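Side note for anyone debugging this: the Makefile pipes curl through `head -n 1 | awk '{print $2}' | grep -q 200`, so Solr's actual error response is swallowed. Running the same CREATE request by hand inside the container, without `-I` and without the pipeline, should print the real failure reason — a sketch reusing the compose invocation from the log above:

```sh
# Print Solr's full response for the failing CREATE call instead of just
# checking the status line.
docker-compose --file ./docker/docker-compose.drupal8.yml --project-name IULD8 \
  exec solr curl -s "http://localhost:8983/solr/admin/cores?action=CREATE&name=default&configSet=drupal&instanceDir=default"
```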

jameswilson commented Dec 17, 2018

Looks like we could just put a bash conditional around this command:

```
'tar xfz /opt/spark-project/.spark/solr-data.tar.gz -C /opt/solr/server/solr/default',
```
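Something like this (untested sketch, same paths as the current command) would skip the import when the tarball is absent:

```sh
# Guarded version of the exec payload: only extract seed data if it exists.
/bin/sh -c "make init -f /usr/local/bin/actions.mk; \
  if [ -f /opt/spark-project/.spark/solr-data.tar.gz ]; then \
    tar xfz /opt/spark-project/.spark/solr-data.tar.gz -C /opt/solr/server/solr/default; \
  fi; \
  make reload core=default -f /usr/local/bin/actions.mk"
```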

jameswilson commented:

I found a workaround, but it sucks to have to do this each time:

```
$ docker images
REPOSITORY          TAG                 IMAGE ID            CREATED             SIZE
iuld8_solr          latest              cc5a5023943b        2 months ago        273MB
wodby/drupal-solr   8-6.6-2.4.0         178d2365983f        8 months ago        273MB

$ docker rmi cc5a5023943b
Untagged: iuld8_solr:latest


$ composer spark solr:init
> SPARK_WORKDIR=`pwd` robo --ansi --load-from vendor/bluesparklabs/spark 'solr:init'

 Spark ✨ IULD8 — Creating Solr core…                                            

 [Exec] Running composer run -d /path/to/some/project spark containers:exec 'solr' '/bin/sh -c "make init -f /usr/local/bin/actions.mk; tar xfz /opt/spark-project/.spark/solr-data.tar.gz -C /opt/solr/server/solr/default; make reload core=default -f /usr/local/bin/actions.mk"'
> SPARK_WORKDIR=`pwd` robo --ansi --load-from vendor/bluesparklabs/spark 'containers:exec' 'solr' '/bin/sh -c "make init -f /usr/local/bin/actions.mk; tar xfz /opt/spark-project/.spark/solr-data.tar.gz -C /opt/solr/server/solr/default; make reload core=default -f /usr/local/bin/actions.mk"'

 Spark ✨ IULD8 — Executing in container: solr                                   

 [Droath\RoboDockerCompose\Task\Execute] Running Docker-Compose: /usr/local/bin/docker-compose  --file ./docker/docker-compose.drupal8.yml --project-name IULD8 exec  -T  solr /bin/sh -c "make init -f /usr/local/bin/actions.mk; tar xfz /opt/spark-project/.spark/solr-data.tar.gz -C /opt/solr/server/solr/default; make reload core=default -f /usr/local/bin/actions.mk"
 [Droath\RoboDockerCompose\Task\Execute] Running /usr/local/bin/docker-compose  --file ./docker/docker-compose.drupal8.yml --project-name IULD8 exec  -T  solr /bin/sh -c "make init -f /usr/local/bin/actions.mk; tar xfz /opt/spark-project/.spark/solr-data.tar.gz -C /opt/solr/server/solr/default; make reload core=default -f /usr/local/bin/actions.mk"
No container found for solr_1
 [Droath\RoboDockerCompose\Task\Execute]    Time 0.555s
 [Droath\RoboDockerCompose\Task\Execute]  Exit code 1  Time 0.555s
 [notice] Stopping on fail. Exiting....
 [error]  Exit Code: 1 
 [error]    in task Droath\RoboDockerCompose\Task\Execute 

   
Script SPARK_WORKDIR=`pwd` robo --ansi --load-from vendor/bluesparklabs/spark handling the spark event returned with error code 1
 [Exec]  Exit code 1  Time 1.199s
 [notice] Stopping on fail. Exiting....
 [error]  Exit Code: 1 
 [error]    in task Robo\Task\Base\Exec 

  
 Spark ✨ IULD8 — Executing in container: solr                                   

No container found for solr_1
 
Script SPARK_WORKDIR=`pwd` robo --ansi --load-from vendor/bluesparklabs/spark handling the spark event returned with error code 1

jameswilson@Makak ~/App/IULD8 (master=) 
$ composer spark containers:start solr
> SPARK_WORKDIR=`pwd` robo --ansi --load-from vendor/bluesparklabs/spark 'containers:start' 'solr'

 Spark ✨ IULD8 — Starting containers…                                           

 [Droath\RoboDockerCompose\Task\Up] Running Docker-Compose: /usr/local/bin/docker-compose  --file ./docker/docker-compose.drupal8.yml --project-name IULD8 up  -d solr
 [Droath\RoboDockerCompose\Task\Up] Running /usr/local/bin/docker-compose  --file ./docker/docker-compose.drupal8.yml --project-name IULD8 up  -d solr
Creating network "iuld8_default" with the default driver
Building solr
Step 1/4 : FROM wodby/drupal-solr:8-6.6-2.4.0
 ---> 178d2365983f
Step 2/4 : USER root
 ---> Running in cc754f912d5a
Removing intermediate container cc754f912d5a
 ---> 37c49e178563
Step 3/4 : RUN mkdir -p /opt/spark-project
 ---> Running in 576ea5712416
Removing intermediate container 576ea5712416
 ---> adc28ce0ea44
Step 4/4 : USER $SOLR_USER
 ---> Running in 2e20bc333075
Removing intermediate container 2e20bc333075
 ---> 0e2d171adb8b
Successfully built 0e2d171adb8b
Successfully tagged iuld8_solr:latest
Image for service solr was built because it did not already exist. To rebuild this image you must use `docker-compose build` or `docker-compose up --build`.
Creating iuld8_solr_1_625167647a78 ... done
 [Droath\RoboDockerCompose\Task\Up] Done in 4.435s
```
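So the workaround, in order (note that `solr:init` has to come after `containers:start` — as the log above shows, it fails with "No container found" otherwise), assuming the stale image is tagged `iuld8_solr:latest`:

```sh
# Drop the stale image, let compose rebuild it, then re-run core init.
docker rmi iuld8_solr:latest
composer spark containers:start solr
composer spark solr:init
```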
