Does GitBook create a merge request before updating GitLab? - integration

We would like to adopt GitBook for our documentation and integrate it with our GitLab repo. I understand the integration is bi-directional, but I want to find out whether, when we update GitBook, the new information is pushed to our GitLab repo automatically, or whether we first get a merge request.

Related

How to update ReadTheDocs Project Name and URL

I'm working on a Python package where we decided to update the package's name after an initial release of the package and documentation under the old name.
Logging into the associated ReadTheDocs (RTD) account, I'm able to navigate to the project and change the name from oldproject to newproject. That changed the display name in the "Project Dashboard" in the RTD account, but it doesn't affect the associated URL for the RTD build.
The project's RTD URL was initially oldproject.readthedocs.io/en/latest/index.html. I'm hoping we can update it to be newproject.readthedocs.io/en/latest/index.html. Can anyone point me in the right direction to update the URL, but retain our prior documentation build history?
It's recommended that you create a new project. You can then delete the old project, or perhaps leave a note explaining that it's been renamed.
See: https://docs.readthedocs.io/en/stable/faq.html#how-do-i-change-my-project-slug-the-url-your-docs-are-served-at

Update a github repository secret from a github action

I have a website that fetches the Facebook events of one of my pages with a Ruby script.
The script is executed within a GitHub Action before the build.
Unfortunately, the Facebook token has limited validity. I managed to find a way to renew it, but I'm wondering: is it possible to update my FACEBOOK_TOKEN repository secret from a GitHub Action?
Of course, I'm open to alternatives, like finding a way to get a permanent token!
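One hedged sketch of the approach: the default GITHUB_TOKEN cannot write repository secrets, so this assumes you first store a classic Personal Access Token with repo scope as a second secret (called GH_PAT here; the workflow name, schedule, and the renew_token.rb script are all illustrative placeholders):

```yaml
name: Renew Facebook token
on:
  schedule:
    - cron: '0 3 * * 1'   # weekly; adjust to the token's validity window

jobs:
  renew:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Renew and store the token
        env:
          GH_TOKEN: ${{ secrets.GH_PAT }}               # PAT with repo scope, stored beforehand
          FACEBOOK_TOKEN: ${{ secrets.FACEBOOK_TOKEN }}
        run: |
          # Hypothetical renewal step -- replace with your actual Ruby renewal script
          NEW_TOKEN=$(ruby renew_token.rb "$FACEBOOK_TOKEN")
          # gh can update a repository secret when GH_TOKEN is a PAT rather than
          # the workflow's default token
          gh secret set FACEBOOK_TOKEN --repo "$GITHUB_REPOSITORY" --body "$NEW_TOKEN"
```

The `gh secret set` command handles the libsodium encryption that the underlying secrets API requires, which is why it is simpler than calling the REST API directly.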

Push file to another repository from another organization with github action

I own two GitHub accounts and would like to push one file from accountA/repoA to accountB/repoB automatically and periodically through a GitHub Action.
I have come across some tutorials, but they seem to be geared towards repositories in the same organization.
A simple script example or tutorial would be very helpful.
The general idea would be an action like andstor/copycat-action.
That would require a Personal Access Token with access to accountB/repoB.
You schedule the action on accountA/repoA and configure it to push the file to accountB/repoB.
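Without a third-party action, the same idea can be sketched with plain git in a scheduled workflow. This is a sketch, not a definitive setup: the file path, the CROSS_REPO_PAT secret name (a PAT from accountB with write access to repoB), and the repo names are assumptions:

```yaml
name: Sync file to accountB/repoB
on:
  schedule:
    - cron: '0 6 * * *'   # daily; adjust as needed

jobs:
  push-file:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Copy the file into the target repo and push
        env:
          PAT: ${{ secrets.CROSS_REPO_PAT }}   # PAT from accountB with write access to repoB
        run: |
          git clone "https://x-access-token:${PAT}@github.com/accountB/repoB.git" target
          cp path/to/file.txt target/path/to/file.txt   # hypothetical file path
          cd target
          git config user.name "sync-bot"
          git config user.email "sync-bot@users.noreply.github.com"
          git add path/to/file.txt
          git commit -m "Sync file from accountA/repoA" || echo "No changes to push"
          git push
```

The `|| echo` guard keeps the scheduled run green when the file hasn't changed since the last sync.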

Pull an API, create a file, push to Netlify with GitLab CI

So I have a GitLab CI pipeline that currently runs every Monday at 6am. All it does is push a build command to Netlify using a build hook.
My current setup is GitLab for version control, Netlify for deployment, and HugO as my static site generator.
No problem.
Here's what I'm trying to do: I have access to an API that shows me all of the items on a particular website (podcast) as JSON.
I want to write a GitLab CI job that will fetch the API, grab the newest item, create a new page with hugo new content/{title}.md, and fill that file's front matter with data from the JSON object.
I'm not even sure this is possible, or that this is the best route to go.
But basically, every time I upload a new podcast, I want Gitlab and Netlify to rebuild my website with a dedicated page for that episode.
The easiest route is to parse the JSON with JavaScript and NOT create a separate page, but I guess you've already figured that out.
The way you describe is also possible. Any server-side script can fetch the API and run the hugo new command (as long as it runs on the deployment server). I would do it in PHP on my server, but I'm kind of old-fashioned. GitLab CI can probably do this too, but I have never tried it.
You did not really ask a question, but I hope I answered it anyway.
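For the GitLab CI route, a minimal sketch of a scheduled job might look like the following. The API URL, the JSON field names (title, description), and the PUSH_TOKEN variable (a project access token with write_repository scope) are all assumptions about your setup, and appending the description after `hugo new` is a simplification of filling the front matter:

```yaml
# .gitlab-ci.yml -- scheduled-job sketch
create_episode_page:
  image: alpine:latest
  rules:
    - if: '$CI_PIPELINE_SOURCE == "schedule"'
  script:
    - apk add --no-cache curl jq git hugo
    - LATEST=$(curl -s "https://example.com/api/episodes" | jq '.[0]')   # hypothetical endpoint
    - TITLE=$(echo "$LATEST" | jq -r '.title')
    - SLUG=$(echo "$TITLE" | tr 'A-Z ' 'a-z-')   # crude slug; lowercase, spaces to dashes
    - hugo new "content/episodes/${SLUG}.md"
    - echo "$LATEST" | jq -r '.description' >> "content/episodes/${SLUG}.md"
    - git config user.email "ci@example.com"
    - git config user.name "CI"
    - git add content && git commit -m "Add episode ${TITLE}" || exit 0
    - git push "https://ci:${PUSH_TOKEN}@${CI_SERVER_HOST}/${CI_PROJECT_PATH}.git" "HEAD:${CI_COMMIT_BRANCH}"
```

Since the new page is committed back to the repo, the push itself can trigger the rebuild; the existing Netlify build hook would then only be needed if Netlify isn't already watching the GitLab repo.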

OpenShift GitHub status result

I've managed to get OpenShift to download my GitHub commits and fire on a webhook without issue. What I'd love to be able to do, though, is make use of the GitHub Status API to mark builds as good or bad.
Has anyone had any success doing this? If so, how do you do it? I was thinking of doing it via the postBuild hook in OpenShift, but I don't think I have access to the SHA, nor would I be able to post on failure.
The OPENSHIFT_BUILD_COMMIT environment variable, along with a few others, will be set in the image and provide details about the remote repo used.
https://docs.openshift.org/3.9/dev_guide/builds/build_output.html#output-image-environment-variables
You should be able to read those variables from the script you run as part of the postCommit hook.
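As a sketch, the postCommit script runs in a container from the freshly built image, where OPENSHIFT_BUILD_COMMIT is available; the GITHUB_TOKEN environment variable and the owner/repo path below are assumptions you would supply yourself:

```yaml
# BuildConfig fragment -- posts a commit status from the postCommit hook
spec:
  postCommit:
    script: |
      curl -s -X POST \
        -H "Authorization: token ${GITHUB_TOKEN}" \
        -d '{"state":"success","context":"openshift/build","description":"Build succeeded"}' \
        "https://api.github.com/repos/youruser/yourrepo/statuses/${OPENSHIFT_BUILD_COMMIT}"
```

One caveat matching the question's concern: the postCommit hook only runs after a successful build, so reporting failures still needs a different mechanism (for example, a pipeline that watches the build and posts a "failure" state itself).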