Attempting to get/understand feedback on the submittal/approval process after following the above steps. I waited 48 hours before creating this forum post so as not to show too much impatience.

Should I inspect the Pull Requests tab, I see '0 Open, 1 Closed', which is a bit misleading. Is the pull request I made closed, meaning no further action is/will be taken, or is this just an ack that the files are in the pipeline, so the actual pull request is no longer needed?

If I navigate back to the pull request page from two days ago, I can get to a page that indicates I did something, but I am unable to see what it actually was.

merged commit 14618de into master 2 days ago

So, is this the master branch of my forked copy, or the master branch of espruino/EspruinoDocs?
e.g. Did I goof somewhere and create a pull request into my own /EspruinoDocs copy?

When I attempt a pull request, I can only push files, which agrees with the 'Writing Modules' content (which never mentions the word 'push'). This then implies that my process is what was intended to have happened.

If everything is going as planned, roughly how long is the normal cycle time? I assume there is a review and sanitizing step in the works before finalization.

Is there a way to review the actual original pull request?

Where am I able to get feedback on the process at GitHub? . . . . and what/where should I be looking out for?

How long is the red carpet? Black tie and tails? and how large a band should I consider? Black Dyke Mills from the sixties (Jim Shepherd) my favorite. Are horses out of the question during pinning of the OBE and 21 cannon salute?

It looks from your fork https://github.com/sleuthware/EspruinoDocs/branches (is that you?) that you created a branch and merged it into your own fork. If you click 'New pull request' while looking at the master branch (or the one you care about), it should give you the option of creating a pull request on the main repo.

Once you have done it, the answer is really 'it depends' - simple stuff can get merged within a few minutes, otherwise it can take a few days - weeks if I'm busy.

Looking at the diffs in your case, you'd probably need to remove a few extra files you have there:

- the HTML files
- the modules/test*.js files
- modules/codeNeopixelEx3.js - and it looks like a few other files that seem to be snippets of code you're using for testing?

Generally people send PRs for small things, one at a time - one module and the docs for it. Since it looks like your PR has 25 files in it, I'll have to find time to go through and figure out what is/isn't needed - if you could try to issue PRs for smaller bits at a time (e.g. Colors), that makes life a lot easier.
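For what it's worth, one way to build such a small PR is to start the branch from the main repo's master rather than the fork's, so no unrelated test commits ride along. This is a sketch only - the remote name 'upstream', the branch name, and the file paths are illustrative, and it assumes a local clone of the fork:

```shell
# One-time setup: add the main repo as a second remote (the name is a choice).
git remote add upstream https://github.com/espruino/EspruinoDocs.git

# Branch from upstream's current master, so the branch starts clean.
git fetch upstream
git checkout -b colors-module upstream/master

# Stage only the files that belong in this one PR.
git add modules/Colors.js modules/Colors.md

# Commit, push the branch to the fork, then open the PR from that branch.
git commit -m "Add Colors module and docs"
git push -u origin colors-module
```

Because the branch's only commits are its own, the resulting PR lists just those files.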

' I'll have to find time to go through and figure out what is/isn't needed '

There are only four code files (for dependencies, see NeopixelQuickStart.js) and three doc files. No edits to existing files.

I thought I had it, then it appears to get worse.

I created a new branch, Neopixel20181101, and added the seven unique new files (project complete) that I intend to create the request from: three modules, three corresponding markdown files, and a sample code usage file.

But when I created the pull request on just that branch, GitHub attempted to add 67 other files which weren't part of what I had just uploaded. Was this a result of previously uploading files with similar names to perform remote testing (more below)?

Also, there is now a strange error: 'Merging is blocked'

Neopixel20181101 #462

Open sleuthware wants to merge 74 commits into espruino:master from sleuthware:Neopixel20181101

on the same page:

Merging is blocked

The base branch restricts merging to authorized users. Learn more about protected branches.
You’re not authorized to merge this pull request.

I scrapped completing the pull request.

Time to consider that it appears this fork may be corrupt.

Note: While I had used the \EspruinoDocs\modules folder in my fork to perform testing, that was done because the IDE wasn't allowing mapping of the \modules and \projects folders within a Windows documents folder. It was my only option to use local module files at all.

As that is now resolved and I'm able to load modules locally, other than using that area for remote require() module testing, I am at the point of considering scrapping the works. I'm truly only out the time I'll need to re-upload the files under test.

Should this be considered at this time? e.g. starting over by deleting the entire EspruinoDocs copy and creating a new forked copy.

After reviewing the above, attempting to clone a copy of EspruinoDocs into a unique new fork resulted in the inability to create a pull request into the espruino/EspruinoDocs branch. Forking a copy per the instructions at Writing Modules only pulls a copy on top of the current, suspect corrupt copy. So it appears that one may only create a pull request from a fork of the original copy, which makes sense.

To the software gods, I concede.

I have answered my own suspicion and am deleting the original suspect corrupt forked copy and starting over with a fresh forked copy. Goodbye, all that litter!

This should make everything easier having only one upload of unique new files to deal with.

Have fun at the conference and that should give me enough time to get a first tutorial done.

I don't understand why additional files are added - and the number keeps growing - when I create the pull request, though. It is possible this is leftover litter from the months of upload testing I did a year ago, when I had the local module folder debacle. That is the only thing that makes sense.

Presumably, the pull request is:
created from the espruino/EspruinoDocs side, to allow the branch compare and to find my forked copy?
As opposed to from my online account side, as it doesn't appear that the espruino source can be linked to?

I will attempt a local copy of GitHub, should this next attempt fall short.

I have seen your keen interest in matrix Neopixel projects over the last year. Phase III of this deployment intends to add user defined effects to the tutorial (underway as we speak) to show off Espruino's talent and capabilities. My Phase I part is keyed in to just Neopixel strips with effects along with user defined content. I hadn't considered matrix layouts and sure could use your input here.

As Gordon is busy until after this weekend, and I don't have available time until then either, this merge is now delayed week(s) in any event.

Would you like to review/test a quick-start demo code file that incorporates those modules (à la MaBe's StopWatch inspiration), and add your expert advice, constructive criticism, and comments of any flavor - to improve the quick-start file and offer suggestions for improvements, omissions, etc. to the overall project . . . . before my final upload to the EspruinoDocs master? I have verbose comments included and JSDoc function-header .html to go with it.

My overall goal is to include some matrix-flavor enhancements down the road in Phase III, and it just might be that you have a 'wealth of insight' (read: snippets, code examples, etc.) to add here. Honestly, I really don't have an inkling to dink with the matrix stuff; it just doesn't impress me at the moment. Would love to hear your take on this objective also.

In the GitHub web UI on the remote repo / fork of ao, a merge request (MR) is created from the pushed branch
a) to master of the fork - .../ao/EspruinoDocs - and also
b) to master of the origin - .../espruinoDocs.

The MR towards the master of the fork is merged (it can be merged by ao, because ao is the owner of the repo).

The MR towards the master of the origin cannot be merged by ao. @Gordon has to do that, because it is 'his' - locked-down - repo.

The reason for making MRs from feature branches of the fork, and not from the master branch of the fork (after merging the feature branch), is that there may be multiple feature branches in different stages in the fork, but only the desired and completed ones should go into the MR towards the master of the origin. It may well be that some feature branches never even make it beyond being branches on the fork.
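As a concrete check of that, you can preview exactly which commits such an MR would carry before opening it - a sketch, assuming the origin repo is reachable as a remote named 'upstream' (the branch name is just the one from earlier in the thread):

```shell
git fetch upstream

# Commits on the feature branch that upstream's master does not have --
# exactly the set a merge request from this branch would propose.
git log --oneline upstream/master..Neopixel20181101

# The file-level diff the PR page would show; the three-dot form compares
# against the common ancestor, as GitHub does.
git diff --stat upstream/master...Neopixel20181101
```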

9.a) is not required. The changes will come with the update of the fork. The changes may themselves be changed, because most likely @Gordon will make some changes to the MRed feature branch before merging it into the origin.

(...somehow the post got unintentionally duplicated... so I split it...)

After @Gordon has performed the MR into his .../espruino/espruinoDocs repo - either unchanged or, most likely, with some changes - everyone else's fork now has to be updated with @Gordon's commits. This will also bring all the new stuff into the fork.

I have my way of updating my fork... but there may be better ones. The simple one - deleting the fork and forking again - works for sure, but only when nothing else is pending in the repo, such as other feature branches or MRs of them - branches already pushed but not yet made into merge requests, or not even done with development yet.

Updating a fork from origin in detail... that's a topic for a next post.
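Until that post materializes, here is one common recipe for it - a sketch, assuming a local clone of the fork and that the origin repo has been added as a remote named 'upstream':

```shell
# One-time setup: register the origin repo as the 'upstream' remote.
git remote add upstream https://github.com/espruino/EspruinoDocs.git

# Pull the origin's new commits into the fork's master...
git fetch upstream
git checkout master
git merge upstream/master    # a fast-forward if master has no local commits

# ...and publish the updated master back to the fork on GitHub.
git push origin master
```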

Can we stick with one kind of numbering for the steps? ...so the contributions can augment one another? I'll gladly adjust the list in post #8 with whatever it takes to get it ready for extraction into a tutorial, or at least a separate conversation. If a step is 'too big', we can cut it, or add sub-items, such as a) b) c)... I'd also like to add examples.

'You cannot delete branches that are associated with open pull requests.'

Although it would have been nice to test whether GitHub would stop/block the repo deletion while there was an in-process branch PR, we'll have to take their docs at face value. As I couldn't close it (the buttons were not illuminated), maybe I could have deleted the branch first, then deleted the repo?

Interesting note: I first renamed the suspect corrupt fork (as I didn't want to have to re-upload other project test files). I then attempted to create a new fork. GitHub just made the fork a copy over the renamed one, and not a unique new instance. I'm forced to delete the renamed one first.

from #13 'Can we stick with one kind of numbering of the steps?'

Yes. @allObjects, use your numbering for a local install of a GitHub repository. The following, with annotations, may be used to create documentation for those who wish to use a public online fork. Both have advantages and disadvantages. I left off the numbering so as not to conflict with a chosen system.

Note: Formatting this was a pain - indenting not recognized in list - it is what it is . . . .

Using a public online fork - uses button clicks as opposed to manually entering git commands

A fork is actually not that much different from a plain repository. The only difference is that the fork has an origin, whereas the repo is the origin, and therefore a change in the fork that should go to the origin has to go through the process of a merge request (MR). An MR can have as much in it as you put into it. The owner of the origin can then either accept the MR as-is, apply modifications to some parts before accepting, accept only a part - called cherry-picking - or reject the MR altogether.

Since the origin itself changes too, and the MR - when merged, whether changed or partial - is also a change versus what the fork 'knows', the fork has to be updated. I promised earlier to post about it... and it has to stay a promise for now.

To get some understanding of the guts, take a look at this book: http://shop.oreilly.com/product/0636920022862.do - there are other good - and maybe even free - books and tutorials out there... beyond the 10-day free Safari read-online trial for this O'Reilly book.

It lacks some graphics to visualize 'the flow of things' and some basic text or reasoning for why this particular flow.

You can put that together in your head with a simple fact about how edition id-ing works in git / GitLab / and the like - all based on Linus Torvalds' 'suggestion': the id of a new edition is construed from the previous / parent id(s) and the new content. Ids are also called commits or hashes - the latter hints at how the id is constructed: it is actually calculated with an algorithm. I get it, you will say: every child knows its parent(s), and any parent was once a child (except the first of something... the empty repository). The ramifications show the ingenuity of this setup in many aspects, for example:

If you and I have the same id, we know that we have the very same content.

Nothing can come out of thin air (except the first edition): everything is a very well trackable change to a previous edition (in the case of the first edition, the empty repo is that previous thing).

Confidence - I do not need to go do a check... and miss a thing because of my imperfectness.

Performance - performance - performance... Since the same hash means the same content, there is no need for a compare, which saves time - not just for the compare itself: I do not need to ship things around and waste bandwidth to know whether it is the same or different, and I need only what is different to have a complete picture of a new hash...

Btw, until recently it held that one content and id together create one and only one unique hash. You can also say just content, and define the content to include the id... just like a data record in a reasonable (indexed) database setup, where the id is part of the record. The existing uniqueness is, though, more than sufficient for source code - which is verbose, has syntax, and bears semantics... A content that would produce the same hash but be different would be absolutely different in regard to those aspects.
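That id-from-content idea is easy to see concretely. For a file ('blob'), git hashes a short header plus the content with SHA-1, so identical bytes yield the identical id on every machine - which is exactly why two repos can recognize shared content without comparing it. A small sketch using plain shell tools, with `git hash-object` showing git agreeing:

```shell
# Git's blob id is SHA-1 over "blob <size>\0" followed by the content.
# For the 6-byte content "hello\n":
printf 'blob 6\0hello\n' | sha1sum | cut -d' ' -f1
# -> ce013625030ba8dba906f756967f9e9ca394464a

# Git computes the very same id:
printf 'hello\n' | git hash-object --stdin
# -> ce013625030ba8dba906f756967f9e9ca394464a
```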

If you re-read 'Interesting note' above, you'll see that I attempted what is described. When attempting to get a new forked instance, GitHub knew the origins of the renamed forked copy, and just created the new fork as a copy over the renamed one. This happened twice.

@allObjects
thanks for this complete step-by-step list. I have been using this for my fork of Espruino, but ended up with a large list of changes, including merges from the master. So I decided to close those PRs again, and sometimes I asked Gordon to do some cherry-picking :-(

@MaBe, everyone can create a merge request when it comes from a fork of the origin. By write access I meant that you can - at least - create branches in the origin... which still does not mean that you can merge merge requests...