Why Slack is winning

Or: the importance of cross-platform apps.

I saw a Twitter post the other day about how Microsoft bought Yammer for a billion dollars, and how Slack came in and kinda took over what Microsoft thought Yammer could be.

It made me reflect a bit.

I’d thought of using Slack on a few projects I have, but I had never thought of using Yammer.

Why is that?

Slack has a Mac app.

Could something as simple as that be it?

Certainly Slack has done quite a few neat things, but really at the end of the day, aren't Slack, Facebook, Yammer, NewsGator, you-name-it all pretty much re-imaginings of the "bulletin board"?

Let's look at the importance of having 'native' apps. (I put native in quotes because, in this case, it's not really important if the app is "truly" native - it might have been developed once using a cross-platform framework.) Note that everything in the list below is solved by having an "app for that".

Let me share some things I don’t like about your web app (This is ANY web app)

1) I don't like not being able to start your web app easily.

2) I don't like having to sign on to your web app.

3) I don't like that when your web app is running, I can't easily tell which of my running apps is yours, and which is the Google page I opened.

4) I don't like that I can't easily bring your web app to the front of my screen (because of #3).

5) I don't like that I accidentally close your app when I close my browser, and have to go through steps 1 and 2 all over.

In other words, if I'm doing something every day, the 5 things above will annoy me as a user EVERY DAY and I will seek alternatives.

The original folks that created Yammer made a nice app using Adobe AIR - this was a cross-platform app that worked on Mac and Windows, and I used it when I was first introduced to Yammer.

Then Microsoft killed it.

The day the Yammer Mac app stopped working on my Mac was the last day I used Yammer. I had actually forgotten that Yammer existed until I saw the tweet that sparked this blog post.

I'm sure some will argue that there are other reasons Yammer failed to live up to its expectations, and some will argue there are other reasons Slack is so popular right now. I don't mean to discount those reasons, I just felt like sharing that Yammer lost me when they dropped Mac support.

Is the Acronym MVC even accurate?

MVC, short for Model View Controller, is a pretty popular acronym in the web development space, but I wonder if the acronym itself is making things more difficult for beginners?

I know it did for me.

I think a better acronym might have been RCMV.

Let me explain my thoughts…

What's the first thing you learn in any MVC tutorial? Routing. Where's the 'R'outing in MVC? It's missing.

Routing is not only the first thing you learn, but it's also the first thing a request hits on its way in and back out of your webserver. Routing comes first, and it's not even mentioned in MVC.
Let's add the R to MVC. While MVCR rolls off the tongue, it still puts the letters in the wrong order. Routing comes first, so it should really be RMVC - but are the remaining letters in order?

What happens after routing? The route hands the request to a Controller, so logically our acronym should start RC _ _
Controllers typically make use of Models, so in order that gives us RCM _ which leaves the View, or RCMV

R - Route
C - Controller
M - Model
V - View

See what we did there? We now have an acronym that not only spells out each piece of how we do things, but does so in order.

I wonder, how much easier would this be for people brand new to Software Development? Would it make it 1% easier? 3%? 5%?

It seems to me that at some point in learning, our brains jump from having to look something up every time to "just knowing it". How long that progression takes varies per person. But one thing is certain: up until the point where a developer "just knows it", they don't know it, and have to expend some thinking power and energy EVERY time they come across it.

If RCMV can lower the amount of thinking power needed during that critical early learning stage, isn’t that a good thing?

Here’s my ask:
If you find yourself helping a brand-new developer trying to understand "MVC", try presenting it to them as RCMV and see if that makes it easier.

I've recorded a few training videos here and there. Being a bit of an audio nerd, I'm really concerned with sound quality. The room I did my recordings in was a small office with lots of hard surfaces, and there was a noticeable echo.

As my audio engineer friend Bob Demaa would say: "You can take the mic out of the room, but you can't take the room out of the mic."

I chose all suppliers based on location – I’m in the Chicago Area, and the Insulation Boards, which are typically the hardest thing to source, are in stock at Fabrication Specialties and not too long of a drive.

Technique:

DIY panels are both easy and cheap - I wanted these to look as clean as possible, while also being as effective and lightweight as possible.

In that blog post, Eric makes an inexpensive frame around the insulation using 1×2 furring strips from Home Depot.

It seemed like a great idea so I did the same thing, with a twist: I put my wooden frame behind the insulation:

In the picture above I have a 4″ thick piece of Rock Wool, with the frame boards laid out.

As I was walking through Home Depot, I wanted to find something inexpensive and easy to work with that I could use on the front side of the rock wool to maintain a sharp edge. It turns out plastic drywall edging was perfect! It's not only lightweight and easy to cut and work with, but it's perforated, which means I'm not giving up much in the way of acoustic performance around the edges!

I used a pair of Tin snips I had to cut the plastic corners and they worked perfectly.

I wrapped the ceiling panels with white fabric from JoAnn, using a hand stapler:

The hand stapler got a little tiring, and I already owned an air stapler/compressor so I switched over to that after the first one.

I used eye bolts to hang them, here’s a close up of how that worked – the piece you see with the threads was screwed into the ceiling. That was the worst part of this whole project – Tolerances are pretty tight – I had in mind that I would screw the hook all the way to the ceiling for a near flush mount look. My goal was not to see the hooks, which if you think about it, is kinda funny -after all, I’m hanging these ridiculous panels from the ceiling, it’s not like not seeing the hooks is going to increase the “Wife Acceptance Factor” at that point…

Here are a few finished shots of the 4″x2’x2′ panels:

I learned a few things building these that I’d like to pass on:

White fabric is actually kinda hard to make non-transparent - as you can see in the photo, you can kinda see where the drywall edging is underneath. To get around this, JoAnn sells a very cheap white fabric called muslin - it might be worth a layer of that first, or better still, pick a fabric color that isn't white.

The metal hooks I hung with were nearly impossible to get perfectly lined up – if you can stomach seeing the hooks and the eyelets, it’s likely 10x easier to hang them.

The metal hooks I hung them with would rattle when the floor above was walked on. – this eventually either stopped or I don’t notice it anymore, but it could have been easily prevented by using heat shrink tubing on the hooks, or by wrapping them in electrical tape.

Ok on to the wall panels!

For the wall panels, I used 2'x4' sheets of 2-inch-thick rock wool.

I again placed the frame behind the material, and I apologize that I don’t have a bunch of pictures from this phase.

In the picture above you see a nearly completed frame, and you’ll notice I have a center support in this one.

In the next one I built, I switched the center supports to two spaced out at 1/3 intervals like this:

I did this so I could hang the frame either vertically or horizontally.

Speaking of hanging, I wanted something that would hang flush to the wall if possible, and I didn’t want to spend $15 per panel on elaborate hardware.

I found some trim that looked like it would work so I nailed about a 1 foot piece to the inside top and side of every 2×4 panel I built.

Again, I don’t have a picture, hopefully this illustration will do:

The Trim piece works on the wall something like this:

Now for the painful part: Did it work? How long did it take? Did I save any money? Was it worth it?

Yes, it worked. Echo / delay in that room is greatly reduced.

It took several weekends to assemble, plus a trip to the insulation supplier about 30 miles away, plus multiple trips to JoAnn Fabrics and Home Depot.

I built a total of 7 burgundy 2 inch thick 2×4 panels for the walls, plus 4 white 4 inch thick 2×2 panels for the ceiling.

That's a total of 11 panels. I figure I spent $336, or about $31 a panel.

Was it worth it?

From an end results perspective, absolutely! Acoustically treating that room was the best thing I could have done!

From a cost/time/effort perspective, probably not. I was just talking with a friend today who had outfitted his home studio with panels from ATSAcoustics. Those panels are nearly identical to the ones I built above, and only slightly more expensive, with the huge benefit that someone else has figured out all the variables and done all the work - there is something to be said for having something arrive at your doorstep ready to hang on your wall!

Update June 2015:

The DIY panels have held up well. My wife hijacked my office (shown above) and I needed to outfit a second space. I did a ton of research and settled on AudiMute panels and I couldn't be happier! There were a lot of things I liked about AudiMute versus the competitors, and I definitely recommend checking them out. DIY is still cheaper, but as you can see from the article above, DIY wasn't super easy either! I can't tell you how great it felt to just order panels, have them delivered, and hang them, with less than 30 total minutes of my time invested!

I did a few Skype interviews for SharePoint Saturday Chicago Suburbs and found that the room lighting wasn't giving me the look I wanted.

I looked at some professional lights, but they were expensive and I really didn’t have room for them in my small home office.

Ikea to the Rescue

I purchased two clamp mounted desk lamps from Ikea for $9 each. Initially I tried aiming them directly at me, but it was too distracting.

I then purchased two foam core boards from Staples for about $5 each - this setup is pictured below:

When I am not “filming” I can put both Foam boards away and my room is largely back to normal.

The background behind me still felt a little dark in the resulting video, so I grabbed one of those $5 clamp lights from Home Depot and temporarily put it on top of my bookcase, pointed at the ceiling, which reflected light back down and brightened up the back wall:

Screen shot of the Maintenance Window functionality in SharePoint 2013

At the SharePoint Conference this week, session SP210 on upgrading to SP2013 mentioned a brand new feature that didn't exist in the preview edition: Maintenance Windows.

As you can see from the screenshot above, this feature puts up a simple banner alerting users of the upcoming work.

The message is localized in the user's language, so long as language packs are installed.

The “More Information” link can point to any page you specify.

I was pretty excited about this, and couldn’t wait to try it out!

The PowerShell to do this wasn’t as easy as I expected.

I’ve pasted below what worked for me.

# 1st get a content database
Get-SPContentDatabase   # this will list them all

# copy and paste a database ID and use it to assign a specific DB to a variable
$ContentDB = Get-SPContentDatabase -Identity # ID goes here

# now we're going to add a maintenance window to the SPContentDatabase with $ContentDB.MaintenanceWindows.Add()
# before we can do that, we need to create a MaintenanceWindow object and populate it.
# Parameter list: "MaintenanceWarning or MaintenancePlanned", Mt Start, Mt End, Notify Start, Notify End, Duration, UrlForInfo
$MaintWindow = New-Object Microsoft.SharePoint.Administration.SPMaintenanceWindow "MaintenancePlanned","1/1/2013","1/2/2013","11/16/2012","1/3/2013","1:00:00","http://www.mydomain.com/outageinfo.html"

# Parameter list for above:
# 1: MaintenanceWarning or MaintenancePlanned
# 2: Maintenance Start Date
# 3: Maintenance End Date
# 4: Notification Start Date
# 5: Notification End Date
# 6: Duration in the format DD:HH:mm:ss - "1:00:00" = 1 hour, "1:00:00:00" = 1 day
# 7: URL for info
# Parameters 2-5 all take a date/time in this format: "1/20/2012" or "1/20/2012 5:00:00 PM"

# Now we can see the properties of our MaintenanceWindow by just typing in $MaintWindow and hitting enter:
$MaintWindow

# for me this looked like this:
# MaintenanceStartDate        : 1/1/2013 6:00:00 AM
# MaintenanceEndDate          : 1/2/2013 6:00:00 AM
# Duration                    : 01:00:00
# NotificationStartDate       : 11/16/2012 6:00:00 AM
# NotificationEndDate         : 1/3/2013 6:00:00 AM
# MaintenanceType             : MaintenancePlanned
# MaintenanceLink             : http://www.mydomain.com/outageinfo.html
# UpgradedPersistedProperties :

# ok, with that out of the way, we just need to add it to the content database
$ContentDB.MaintenanceWindows.Add($MaintWindow)
$ContentDB.Update()

Ok so that’s it – refresh your website and you should see the pink banner on the screenshot above!

Note, I originally tried to do this by just setting up a blank object without parameters, and then setting the properties one by one, but I found that MaintenanceStartDate and NotificationStartDate could not be changed after the object was created.

I kept seeing errors in the ULS logs of other sites saying things like:

Failed to create a custom control from assembly ‘NewsGator.NewsManager.Web’ .. The type is not registered as safe.

I know from experience that this means the control isn't listed in the web.config for the given site, nor should it be - I don't have, nor do I want, NewsGator having anything to do with the site in question.

I also know that the errors aren’t really hurting anything, but if nothing else they are making the ULS logs a little bigger and honestly, I don’t want a farm that has known errors in it.

So I set out to understand where they were coming from and how to safely get rid of them.

Finding these in the ULS logs

They are all over our ULS logs, but it's nice to have a quick way to validate whether they are still there, so I did a search with the Windows findstr command:

findstr /C:"is not allowed for web" *.log

The first thing I wanted to do was see if there was an obvious, easy fix - i.e., from Site Settings, Site Features, or Site Collection Features, is there a NewsGator feature that's enabled that I can just turn off?

I tried this, and no, there wasn't.

The solution turned out to be painfully simple.

In the ULS logs, there were entries like this:

Failed to create a custom control 'CustomMenu_NewsStreamAdmin', feature 'NewsGator.NewsManager_Actions' (id:16c89384-881d-44aa-a6f5-f66301596851) using attributes (ControlSrc='', ControlAssembly='NewsGator.NewsManager.Web, Version=1.0.0.0, Culture=neutral, PublicKeyToken=a1b9791f4e4509c7', ControlClass='NewsGator.NewsManager.Web.NewsStreamAdminActions': System.ArgumentException: The control with assembly name 'NewsGator.NewsManager.Web, Version=1.0.0.0, Culture=neutral, PublicKeyToken=a1b9791f4e4509c7' class name 'NewsGator.NewsManager.Web.NewsStreamAdminActions' is not allowed for web
at URL 'https://nonnewsgatorsite.mydomain.com'. The type is not registered as safe.

The error above is always paired with another less descriptive error – but the error above turns out to have all the information we need – the id. (In this case id 16c89384-881d-44aa-a6f5-f66301596851)

In PowerShell, Get-SPFeature will list all the features on the farm - in my case it showed the above ID.

Now, given that NewsGator is legitimately installed on our farm and on ONE web application (URL), I didn't want to remove it from the farm!

What was helpful was the command:

get-spfeature -webapplication https://myurl.mydomain.com

This showed that the feature was associated with that web application and it also showed that it was webapplication scoped.

I had to do this for a few different features - pulling the ID from the ULS logs and running the disable command. While I'm sure it could be automated, I preferred handling it hands-on, doing them one at a time and confirming my SharePoint sites still worked as expected.
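For reference, the disable step itself can be sketched like this - the feature GUID is the one from the log excerpt above, and the URL matches the web application I ran Get-SPFeature against; substitute your own values:

```powershell
# sketch: disable a web-application-scoped feature by ID
# the GUID and URL below are the examples from this post - use the ID from your own ULS logs
Disable-SPFeature -Identity 16c89384-881d-44aa-a6f5-f66301596851 `
    -Url https://myurl.mydomain.com -Confirm:$false

# verify the feature no longer shows for that web application
Get-SPFeature -WebApplication https://myurl.mydomain.com
```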

After that, the errors in the ULS log stopped for that site, and get-spfeature -webapplication https://myurl.mydomain.com no longer showed that feature.

It was a great feeling to get these nagging recurring ULS entries to stop!

Something went wrong and FAST is returning results for files that no longer exist.

The “Normal” way to fix this is to do another crawl of the content source – In this case, it did not work.

The “best” way to fix this is to reset the index and re-crawl all the content.

Unfortunately, because of the size of our FAST install, this is not practical - it takes over a week to index everything.

In other words, fixing this problem the "right" way will also bring down FAST search for at least a week for some content - not good.

On a support call with Microsoft, they told me of a quick way to remove individual results - it's not quite as effective as a full index reset, but it has its place. For example, say a confidential document got crawled, and the summary is showing up in search results. You'd want to get that out of the way right away - this approach is good for that.

1) First, download the free tool FS4SP Query Logger by Mikael Svenson - I found version 3 on CodePlex with a quick internet search.
2) Run it on the FAST server and click the Start Logging button, then go do your search using whatever search page is returning the bad results.
3) Once you see your search term show up in the upper left, look at the result XML and find the result.
4) You'll want to grab the value of the "contentid" field - it will look something like this:

<FIELD NAME="contentid">ssic://SomeNumberHere</FIELD>

Be sure the Area of XML you are looking at matches the search result you are trying to eliminate!

Now, also on the FAST server, open a FAST PowerShell command prompt and enter the command:

DOCPUSH -c sp -U -d ssic://YourNumberHere

Just like that, your search result should stop appearing in search results.

—————————————————————————–

As a side note, while we were looking at some things, we used a clever PowerShell command to search multiple directories for some text.

Select-String is like grep on Unix or findstr on Windows - it looks for strings.
What was neat here was using a regular expression for the path, which limited the search to just a few key directories - i.e.
\2\232131231231\index_data\urlmap_sorted.txt
and
\3\4223453\index_data\urlmap_sorted.txt
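I didn't save the exact command, but a sketch of the approach - with a made-up data root and search string standing in for the real ones - would look something like this:

```powershell
# sketch only: the data root and search text below are placeholders, not the originals
# recurse the data folders, keep just the urlmap_sorted.txt files under index_data, and grep them
Get-ChildItem "D:\FASTSearch\data" -Recurse -Filter urlmap_sorted.txt |
    Where-Object { $_.FullName -match '\\index_data\\' } |
    Select-String -Pattern 'http://site/somedocument.docx'
```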

Creating a new SPSessionStateService, on the other hand, is a little more involved…

How do I know?

I’m glad you asked….

I ran into an issue where an Access report would not display because "session state is not turned on". It didn't say which one, and through some trial and error, I now understand it was likely looking for the service returned by Get-SPSessionStateService.

For me, that returned a blank line with no database entry, so I thought I'd be best off deleting it and recreating it from scratch.

I was wrong.

While deleting and recreating the SPStateServiceApplication is easy, the SPSessionStateService was not easily recreated in SP2010 with the included PowerShell commands.

I enabled the ASP.NET State windows service, then followed the article above, stopping about halfway through, before the provisioning part.

To Provision it, I used Enable-SPSessionStateService -DefaultProvision

Get-SPSessionStateService now returns a complete row, with a database server, a DB name, an ID, and best of all, Enabled = True.

So, to summarize my problem:
MS Access Services reports needed the "SPSessionStateService", which in turn uses the windows service "ASP.NET State Service".

In troubleshooting, I wasn't aware of the difference between the two, so I deleted the "wrong" one in an attempt to reset it.
A little digging later, and I now have a better understanding of the issue and of the two different state services.
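To recap the fix as commands - all mentioned above, run from a SharePoint 2010 Management Shell, with the ASP.NET State Service enabled in Windows first:

```powershell
# check the session state service - a blank row with no database entry was my symptom
Get-SPSessionStateService

# provision it with default settings (this is what fixed the Access reports for me)
Enable-SPSessionStateService -DefaultProvision

# verify: should now show a database server, DB name, ID, and Enabled = True
Get-SPSessionStateService
```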

VMware Fusion 5 was released this week.
Here are a few quick thoughts…

They now sell a “regular” ($49) and “Professional” ($99) version

Now with the introduction of the Professional edition, VMware is only selling upgrades to Fusion 5 Professional for $49

At first this might seem unfair, but I checked my order history and I paid $49 for the upgrade from v3 to v4 so the upgrade price is the same as the last go around, and you’re getting the higher end version of the product.

Now for the good news…

Historically VMware Workstation on the PC has had more features and was better suited to running VM labs for trying out new stuff.

Fusion seemed more consumer focused - i.e., it was a good fit for someone who needed to run one copy of Windows 7, but it wasn't as good as Workstation for someone trying to manage, say, a dozen or two virtual machines for various labs.

Fusion 5 Pro introduced 2 new features that just made this a lot more viable on the mac:

– Folders – This is such a simple, but necessary container. If you have more than a few VMs, it's useful to be able to group them into folders.

– Networking – This isn't as good as I remember it on the PC version - there are no settings to limit bandwidth (useful for simulating slower connections) - but it's nice to see them add the feature. It does mean that you can likely set up multiple, isolated private networks (useful for isolating VM labs from each other).

– Lastly, an annoying behavior they added with VMware Fusion 4 has been resolved - it used to be that when you launched a VM, the "Virtual Machine Library" would disappear - a bit of a pain if you had to kick off 3 or 4 VMs - this window stays put now.

– The Virtual Machine Library also now features both a list and an icon view. The list view is very nice, with a tree view of your VMs (and folders) on the left and a preview window on the right. Under the preview window it shows a few lines of the notes field, so you can easily see what the selected VM is all about - great if you leave yourself notes on what each VM is for. Under that is a storage graph and the ability to see how much disk space you can reclaim.

In my environment, I upgraded from v4 - one of my VMs (Windows 7) was suspended, and that worked fine in 5, but it warned me that some things were different and recommended I shut down the machine so the VMware Tools could be upgraded and the compatibility setting updated.

Another machine had a snapshot, and that seemed to go OK as well - though as a precaution, after the VMware Tools were updated, I deleted my snapshot and created a new, current one.

Configuring the SharePoint BLOB cache is done by editing the web.config of each SharePoint site, on each SharePoint server.

I wrote this quick-and-dirty script so I would not need to edit all the web.configs by hand (I had about 20 web.configs to touch).

Note that since it's just editing the web.config, we don't need to run this in a SharePoint shell - I ran it from an ordinary PowerShell prompt on my workstation.

The script:

Echo "Run this script under an admin account with rights to the servers being hit"

$dir = "\\Server\c$\inetpub\wwwroot\wss\VirtualDirectories"
$currentDate = (Get-Date).ToString("MM_dd_yyyy-hh_mm_ss")   # note: capital MM = month; lowercase mm would be minutes

# loop through each subdirectory to find each SharePoint site.
foreach ($subdir in dir $dir)
{
    # Here in my case, all my SharePoint sites had mydomain.com as part of the folder names,
    # so the Contains statement was an easy way to only touch the web.config's of actual SharePoint sites
    # while leaving alone central admin and other non-SharePoint websites IIS had.
    if ($subdir.Name.Contains("mydomain.com"))
    {
        $path = $dir + "\" + $subdir.Name + "\Web.config"
        echo $path

        # save a timestamped backup of the web.config before touching it
        $backup = $path + "_$currentDate.bak"
        $xml = New-Object XML
        $xml.Load($path)
        $xml.Save($backup)

        # update the BlobCache element and write the file back
        $element = $xml.configuration.SharePoint.BlobCache
        echo $element
        $element.location = "X:\BlobCache\14"
        $element.enabled = "true"
        $element.maxSize = "5"
        echo $element
        $xml.Save($path)
    }
}

If search results from SharePoint (not FAST) search are not showing the right title, and instead are showing a few words from the content of the document, there's a registry setting you can set to fix that.

The registry setting will be found on any machine running SharePoint search (Central Admin -> Manage servers in this farm will show you which boxes have this role).

These PowerShell commands will show what the value is currently (or an error if it's not found - a good sign that you're on the wrong machine!)
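A sketch of what those commands look like - note that the key path and value name below (EnableOptimisticTitleOverride under the Gathering Manager key, for SharePoint 2010) are the ones commonly cited for this symptom; treat them as an assumption and verify against Microsoft's documentation for your version:

```powershell
# assumption: this is the commonly cited key/value for the title problem on SharePoint 2010
$key = "HKLM:\SOFTWARE\Microsoft\Office Server\14.0\Search\Global\Gathering Manager"

# show the current value (an error here is a good sign you're on the wrong machine!)
Get-ItemProperty -Path $key -Name EnableOptimisticTitleOverride

# 0 = keep the document's own Title property instead of an extracted one;
# restart the SharePoint Server Search 14 service and re-crawl for the change to take effect
Set-ItemProperty -Path $key -Name EnableOptimisticTitleOverride -Value 0
```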

I recently ran into a few issues installing SharePoint 2013 that can easily be avoided by installing in a given order.

Firstly for the quick install, this seemed problem free:
1) Build a new Windows 2008 R2 SP1 Standard edition VM
(I gave mine 3GB of RAM)
1a) I joined it to a domain (I did not build the VM as a domain controller)
2) Be sure you have an internet connection
3) Install the SharePoint 2013 preview and choose the standalone option (the one that will install the free version of SQL 2008 R2)
This seemed to work fine.

The trouble came when I tried to install it with the latest version of SQL.
Here’s what I did:
(Warning: this fails!)
1) Build a new Windows 2008 R2 SP1 Standard edition VM
2) Install the RTM version of SQL 2012 - it needed .NET 4 and IIS
(I think this is where things went wrong; SQL 2012 configured IIS with .NET 4, not 4.5)
3) Tried to run the SharePoint install - it failed on the pre-reqs - it seems in my case it could not determine if IIS was set up with .NET properly. I even tried the old aspnet_regiis -i command to force it, but the pre-req installer still stopped at that point.

I must admit, I was in a hurry to see the new SharePoint, so I didn't take additional troubleshooting steps (I could have downloaded the rest of the pre-reqs manually and tried the SharePoint install, but I did not).

Instead I figured I’d try a different approach – first I tried the install I listed at the top of this article – that worked like a champ, but I wanted the SQL2012 based install.

Next I did this:
1) New VM with Windows Server 2008 R2 SP1
2) Run the pre-req installer from the SharePoint 2013 install ISO
(this configured IIS and ASP.NET 4.5)
3) Do NOT run the SharePoint installer yet
4) Install SQL 2012 - since IIS and ASP.NET 4.5 are already installed, it should leave them alone
5) Come back and do the SharePoint install
It’s installing now – I’ll post the results when it’s done.

Our SharePoint farm was in Domain A and we wanted to grant rights to a group in Domain B.
It worked fine from the GUI, but PowerShell Add-SPUser or New-SPUser failed - both stating the user ID we were adding was no good.
Specifically, this was for My Sites - we had thousands of them, so doing it by hand wasn't an option.

<key>MacBookPro6,2</key>
<dict>
    <key>LogControl</key>
    <integer>0</integer>
    <key>Vendor10deDevice0a29</key>
    <dict>
        <key>BoostPState</key>
        <array>
            <integer>0</integer>
            <integer>1</integer>
            <integer>2</integer>
            <integer>3</integer>
        </array>
        <key>BoostTime</key>
        <array>
            <integer>3</integer>
            <integer>3</integer>
            <integer>3</integer>
            <integer>3</integer>
        </array>
        <key>Heuristic</key>
        <dict>
            <key>ID</key>
            <integer>0</integer>
            <key>IdleInterval</key>
            <integer>250</integer>          <!-- Changed to 250, was 100 -->
            <key>P3HistoryLength</key>      <!-- This key is new in 10.7.x; the key and value did not exist before -->
            <integer>2</integer>
            <key>SensorOption</key>
            <integer>1</integer>
            <key>SensorSampleRate</key>
            <integer>4</integer>            <!-- Changed to 4, was 10 -->
            <key>TargetCount</key>
            <integer>1</integer>
            <key>Threshold_High</key>
            <array>
                <integer>57</integer>
                <integer>70</integer>
                <integer>88</integer>       <!-- Changed to 88, was 80 -->
                <integer>100</integer>
            </array>
            <key>Threshold_High_v</key>
            <array>
                <integer>1</integer>
                <integer>3</integer>
                <integer>98</integer>
                <integer>100</integer>
            </array>
            <key>Threshold_Low</key>
            <array>
                <integer>0</integer>
                <integer>68</integer>
                <integer>75</integer>
                <integer>95</integer>
            </array>
            <key>Threshold_Low_v</key>
            <array>
                <integer>0</integer>
                <integer>2</integer>
                <integer>4</integer>
                <integer>99</integer>
            </array>
        </dict>
        <key>control-id</key>
        <integer>17</integer>
    </dict>

I recently had a stuck timer job in our SharePoint farm.
It seemed like an easy thing for PowerShell, but it turned out to be one step more complicated - I'm not sure why, but here's the solution I used. Thanks to Todd from the vendor I was working with for providing the fix!

We can use the cmdlet Get-SPTimerJob to see all timer jobs in our SharePoint farm.

If we add a nice little where clause, we can limit the list to a single item:

Get-SPTimerJob | where {$_.name -like "Name of your stuck job"}

Normally I’ve been able to assign the results to a variable

ie like this:

$badjob = Get-SPTimerJob | where {$_.name -like "Name of your stuck job"}

Which works.
What didn’t work however was this:

$badjob.delete()

For some reason, I got an error that there was no delete method.
Weird.

So instead:

Get-SPTimerJob | where {$_.name -like "Name of your stuck job"} | fl
# I then read the ID from the output of the above (note I added | fl at the end)
# and I copied and pasted it into this command:
$badjobTake2 = Get-SPTimerJob -ID (pasted the ID here)
$badjobTake2.Delete()
# this worked

I’m not sure what the difference is, maybe I even fat fingered it the first time..
but that’s how it got resolved.