Shim boot loader is blocked after enrolling MOK - Fedora

I installed Fedora 34 on a Dell 5510 laptop, and everything worked fine until I generated and enrolled a MOK to use Secure Boot (I used mokutil --import, then rebooted and enrolled the key in the MOK manager). After enrolling the MOK, the BIOS started randomly blocking the shim boot loader (in more than 50% of boots). I had to set Secure Boot to audit mode to prevent this.
EDIT: after testing a bit more, I'm inclined to think that shim actually hangs rather than being blocked. The only combination that allows it to proceed is Secure Boot enabled in audit mode. Enabled in deploy mode - it hangs; disabled - it hangs.
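For reference, this is roughly the sequence I used to create and enroll the key (a sketch; the key/certificate file names and the CN are just placeholders from my setup):
# generate a key pair and a DER certificate to enroll as a MOK
openssl req -new -x509 -newkey rsa:2048 -nodes -days 36500 -subj "/CN=Local MOK/" -keyout MOK.priv -outform DER -out MOK.der
# queue the certificate for enrollment (prompts for a one-time password)
sudo mokutil --import MOK.der
# reboot; the MOK manager comes up before the OS and asks for that password
sudo reboot
# after enrolling, verify the key and the Secure Boot state
mokutil --list-enrolled | grep -i 'Local MOK'
mokutil --sb-state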

Related

.NET DLL using different TLS versions

We have a Microsoft Access application that consolidates many of our activities on eBay. I do this through several COM DLLs that Access references. One part of the application posts listings from our inventory to eBay. This has been working for at least 8 years. Recently eBay changed its servers to require TLS 1.2, and we started getting the "The request was aborted: Could not create SSL/TLS secure channel" message. I looked this up, and MS says upgrading the DLL project to .NET Framework 4.7 or later will fix it. I upgraded the solution framework and recompiled.
The upgraded DLL works with eBay when I call it from a WinForms test application built in VS. When I call it from the Access application, I still get the TLS error. Looking at the communications with eBay in Fiddler, the calls coming from the DLL use TLS 1.2 when it is called from the .NET WinForms test app; when the DLL is called from Access, it is still using TLS 1.0. Both the Access app and the .NET test app reference the same DLL file.
Question: does the application calling a DLL influence the TLS version being used? Any ideas on what could be causing different TLS versions to be used by the same DLL?
Hmm, I did not think that VBA/Office being the host program would change the TLS settings. But you could consider trying this registry edit:
[HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\.NETFramework\v2.0.50727]
"SystemDefaultTlsVersions"=dword:00000001
"SchUseStrongCrypto"=dword:00000001
[HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\.NETFramework\v4.0.30319]
"SystemDefaultTlsVersions"=dword:00000001
"SchUseStrongCrypto"=dword:00000001
I believe you also have to restart your computer for the above to take effect.
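If you'd rather script it than edit the registry by hand, something along these lines from an elevated Command Prompt should set the same values (a sketch; note that if your Office/Access installation is 64-bit, the equivalent keys without Wow6432Node would also need the same values):
rem same values as the registry edit above, applied via reg.exe
reg add "HKLM\SOFTWARE\Wow6432Node\Microsoft\.NETFramework\v2.0.50727" /v SystemDefaultTlsVersions /t REG_DWORD /d 1 /f
reg add "HKLM\SOFTWARE\Wow6432Node\Microsoft\.NETFramework\v2.0.50727" /v SchUseStrongCrypto /t REG_DWORD /d 1 /f
reg add "HKLM\SOFTWARE\Wow6432Node\Microsoft\.NETFramework\v4.0.30319" /v SystemDefaultTlsVersions /t REG_DWORD /d 1 /f
reg add "HKLM\SOFTWARE\Wow6432Node\Microsoft\.NETFramework\v4.0.30319" /v SchUseStrongCrypto /t REG_DWORD /d 1 /f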

Trouble with Xdebug remote_connect_back on macOS Catalina 10.15.7

Let me say first and foremost, I am not trying to INSTALL Xdebug locally on my Mac and cannot run the application on my Mac. I also cannot downgrade my OS below Catalina, so let's just not discuss those options, please...
I got a new MacBook Pro running macOS Catalina (10.15.7) and cannot get Xdebug working to save my life. The PHP application runs on a SHARED development server hosted on Amazon. Seeing as the environment is shared among several developers, we use Xdebug via the remote_connect_back setting.
With my old MacBook Pro running High Sierra (10.13.3), it connects just fine and step debugging has been working for years. The result is the same for all of my co-workers: those on Mojave and below have no issue and Xdebug works wonderfully, while all of us on Catalina and higher cannot get Xdebug to connect to our machines from the shared dev server. We are all running PhpStorm 2020.3 and use the Chrome Xdebug helper plugin (with "phpstorm" set as the IDE key).
I have set up this configuration at least 2 dozen times (and wrote the company guide for setting up the dev environments including Xdebug), so it's a bit embarrassing that it won't work on newer Macbooks.
The computer I am currently testing has the firewall disabled, no internet security, and a clean install as of yesterday. I also tried with Gatekeeper disabled, rootless protection (System Integrity Protection) disabled, and "Allow remote connections from ANYONE". All to no avail. On my previous laptop, I can ssh into the dev server and open a telnet session on port 9000 back to my local laptop, and it works fine. On the NEW computer, if I attempt a telnet session on port 9000 it just times out.
What is interesting is if I turn OFF listening for Xdebug connections in PhpStorm and run the telnet command again, I IMMEDIATELY get a connection refused message. So I KNOW I am getting the right machine, it's almost as if Apple implemented some security through obscurity logic between Mojave and Catalina.
It's also worth noting that one co-worker is running Catalina, but version 10.15.2 and his connects just fine and he is refusing to update (reasonably so). Any help would be greatly appreciated.
Edit/Update: I have tried both Xdebug 2.7.2 via remote connect back on port 9000 and Xdebug 3.0.2 via discover remote host on port 9003.
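For reference, these are roughly the settings in play in the two setups (as placed in the xdebug ini on the dev server; exact file locations vary):
; Xdebug 2.7.2 - connect back to whichever client made the web request
xdebug.remote_enable = 1
xdebug.remote_connect_back = 1
xdebug.remote_port = 9000
; Xdebug 3.0.2 - the equivalent options
xdebug.mode = debug
xdebug.discover_client_host = 1
xdebug.client_port = 9003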
Second Edit/Update: If I open a reverse SSH tunnel on my laptop (ssh -N -R 9003:127.0.0.1:9003 dev_server_host), I can get Xdebug to work. However, this solution isn't practical on a SHARED dev server: if more than one person does this, we get cross-talk. Just noting it to help with troubleshooting.
Third Edit/Update: This issue is still unresolved, so I would love to find a solution at some point, but I have moved on by working with coworkers to get it running locally in Docker. In Docker we have no issue getting Xdebug to connect. Our unit tests run a bit slower in Docker, whether the code is mounted via a volume or not, but that is another topic for another day.

Add a Windows node to OpenShift OKD v3.11

Since Docker can now run on Windows, is there a way to deploy OpenShift OKD on a Windows VM?
In the documentation, under System and environment requirements, we can read that RHEL-family operating systems are needed, but I'm just wondering if there is an alternative process to perform this operation.
My main concern is that I need to run Windows containers on OKD.
The answer is that for OKD 3.11 this is not possible and has to do with the networking (OVS) not being available for Windows machines.
That being said, there is a lot of information available for Windows containers in Kubernetes itself, although there are A LOT of things that are not implemented or not supported at this time: https://kubernetes.io/docs/setup/production-environment/windows/intro-windows-in-kubernetes/
You can expect Windows Containers to become available in OKD 4.5 or later as Tech Preview, but I personally would not hold my breath.
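If you end up experimenting with Windows nodes on plain Kubernetes, a minimal way to see where Windows workloads can land is the standard OS node label (a sketch; assumes you have kubectl access to such a cluster):
# list any Windows nodes joined to the cluster
kubectl get nodes -l kubernetes.io/os=windows -o wide
# a pod meant only for Windows nodes would carry this in its spec:
#   nodeSelector:
#     kubernetes.io/os: windows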

I get a Blank screen when using Windows IoT Remote Client

My Setup:
Windows 10 VM running in VirtualBox on Windows 7 Pro
Raspberry Pi 3 running Windows 10 IoT Core - 10.0.16299.19
The VM can see the Raspberry Pi/Windows IoT.
I know this because:
On the VM I can install and debug from Visual Studio 2017 to the Pi.
On the VM the IoT Dashboard detects the Pi and allows me to change settings.
On the VM I can access the Pi's Device Portal.
On the VM I can use PowerShell to log in to the Pi.
What I can't do is use the Windows IoT Remote Client, which I want so I can see changes produced by my code.
When I start the client I get the spinning buffer animation followed by a blank, white screen.
I have tried the following as recommended in web articles:
Reset the Pi resolution to 800x600 - This killed the Pi's ability to display at all, including on the attached HDMI.
Checked Enable Windows IoT Remote Server in Device Portal - This is set to On.
Disconnected the HDMI from the Pi - Made no difference.
Typed the following into an Admin-level PowerShell:
net start WinRM
Set-Item WSMan:\localhost\Client\TrustedHosts -Value PiName
This allowed PowerShell access but no change to Remote Desktop.
What should I try next?
On version 16299, the Windows IoT Remote Client does not work for the Raspberry Pi. Please reference the known issues in the release notes for Windows 10 IoT 16299. Currently you can attach a monitor for local display.
Try starting the NanoRDP server manually (not through the web interface) and see if that helps. That has solved a few issues for me.
I believe the executable is in c:\windows\ and is called nanordpserver.exe. Just SSH or PowerShell into the device, run nanordpserver.exe, and try again.
The lightweight RDP protocol that IoT Core uses is not as robust as the version installed on the full Windows 10 OS.
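A rough sketch of that, assuming the path and file name above are correct on your image and that SSH is enabled on the device (PiName and the Administrator account are placeholders):
# from your dev machine, open a shell on the Pi
ssh Administrator@PiName
# then, in the remote session, launch the RDP server and retry the Remote Client
c:\windows\nanordpserver.exe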

Open Source Application Server Solution

A project with the following technologies and components has surfaced: standing up a web stack solution initially composed of Ubuntu, the JDK, JBoss, Spring MVC 3.0+, and MySQL.
In planning this project, I have been struggling to find answers to the following questions for first steps, best practices, and sequence:
1) Does the JDK (and JBoss) need to be installed as ‘root’? (I have seen articles that mention it is not a good idea to operate in root unless absolutely necessary due to the fatal consequences.)
2) Does Ubuntu need to be installed as a Server in order to accomplish all this, or can it also be installed as a Desktop? I have not been able to determine whether JBoss and MySQL need to be installed on top of Linux running as a server.
3) Does Maven need to be used within Spring STS in order to get JBoss, and MySQL (and in the future Hibernate) to work successfully together?
4) My intent is to install in this order: a) Ubuntu -> b) Java -> c) JBoss -> d) Spring STS -> and e) MySQL. Are there any blatant conflicts in this sequence?
JBoss will require Java (I recommend Java 7) before it will do anything. I don't think it really "installs" per se, but rather just unpacks to some directory (even if you install it from the package manager, it really just extracts itself). I question your need for Spring, since JBoss and Java EE in general really do everything Spring does, and better, nowadays. Unless you have a specific requirement for Spring, I'd question this extra dependency.
For Linux - at a high level, any OS can be a "server"; all it needs is to be capable of serving things (web pages, SSH connections, etc.). In the M$ world, different "levels" of the OS have been specially designed based on anticipated task/workload. So, for example, while Windows 7 can indeed run as a server, it was not designed for it and therefore may not be optimized or include the helper utilities and tools that make life easier for a sysadmin of the system. Windows Server, on the other hand, does include all the "normal" server tools and lots of goodies to make setting up and maintaining the server easier.
In Linux land, there is no such thing. Linux is the kernel that talks back and forth with the bare metal, etc., and distro makers take that and build an OS around the kernel, basically just attaching whatever packages they feel their distro needs... such as wget, or cat, or any other standard userland apps, plus some non-standard ones such as MySQL or Java or whatever they want.
Now, some distributions of linux will tailor themselves at being "server" ready, while others will tailor themselves at being a desktop OS. The difference? It's really just whatever default packages the distribution maker decides to include or not. For example, the overwhelming majority of linux servers are run completely headless, and therefore there is absolutely no reason to have X11 and a huge bloated GUI environment installed and/or running on that system... it's pointless. Also, an "average joe" user does not need MySQL installed by default on his desktop system since it would only bloat his system and he likely won't ever use it.
So basically it comes down to default installed packages.
Some Linux server distros take this further and exercise extreme caution when making updates, patches, or new releases in the name of stability and security, while on the other hand most desktop distros are more haphazard with their updates, since if an update breaks a home user's web browser, it's probably not a huge deal... but if a server update breaks the web server application stack, now that's a serious problem.
So you'll find server OS's like CentOS (based on upstream RHEL) are extremely slow to bring in the "latest and greatest" features that desktop OS's get early on. Their goal is high security and long term stability.
Now, for Ubuntu. While I certainly know a lot of folks run Ubuntu as their server OS of choice (partly due to Amazon choosing Ubuntu as the default Linux VM for their ECS cloud), I'd really question this. Ubuntu is not focused on being a server; it's focused on being a great all-around desktop-oriented OS. Yes, the LTS version is meant for long-term stability, but it's based on a desktop OS, so that's still not the focus.
IMHO, I'd go with CentOS because it's a free and completely binary-compatible version of RHEL - and RHEL is the de facto standard for enterprise-grade Linux servers. Be aware, though, that the RHEL way of doing things is a bit different from the Debian way -- there is no apt-get, you must use yum install instead. Startup scripts are different and some ways of doing things are different, but really, once you know Linux, you know Linux.
EDIT: Also check into Jenkins - it's a free, open-source continuous integration system that runs on JBoss or Tomcat or any other container, and can automagically pull your code from a repo (GitHub, git, svn, etc.) and compile/package it, then push it to live deployment. You set up your Ant or Maven build scripts, and it can kick off on a schedule or however you configure it.
EDIT EDIT: I'd also recommend using OpenJDK -- it's likely included in your package manager (for just about every distro) and will be more up to date than the Oracle version if that's in your package manager too. I've found most "server" distros will have OpenJDK 7 while only having Oracle Java 6 in their package managers. Also, installing it via the package manager makes it a ton easier to keep it updated.
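As a rough sketch of what that looks like on a CentOS-style box (the JBoss version and archive name are only examples; use whatever you actually download):
# install OpenJDK 7 from the distro's package manager
sudo yum install java-1.7.0-openjdk-devel
# "installing" JBoss is really just unpacking the archive and starting it
unzip jboss-as-7.1.1.Final.zip -d /opt
/opt/jboss-as-7.1.1.Final/bin/standalone.sh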
Installed as root, why not? Run as root, probably not a good idea.
If you want a desktop, install a desktop distrib. If you want a server, install a server distrib. This doesn't change what can and can't be run in the OS. It only changes what is installed by default.
Maven is a build tool. JBoss doesn't care how you build your app; all it cares about is whether the application you deploy is a valid Java EE application (see the short sketch at the end of this answer).
No. You need an OS, so Ubuntu must come first. JBoss and (AFAIK) Spring STS need a JRE to run, as they're Java applications, so Java should be installed before them. MySQL is independent of JBoss, STS and Java, so you can install it whenever you want.
Note that if you're struggling just with this installation part, be prepared to suffer with the rest. Building a Java EE webapp is not a piece of cake, and you should probably find some experienced developer to help you, as it seems you're only beginning with Java.
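To make the Maven point from above concrete, a minimal sketch of building and deploying (the artifact name and JBoss path are only examples; an AS 7-style deployments directory is assumed):
# build the webapp into a .war with Maven
mvn clean package
# deploying to JBoss is just dropping the archive into the deployments directory
cp target/myapp.war /opt/jboss-as-7.1.1.Final/standalone/deployments/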