My project depends on a couple of third party libraries, which bring in okio v. 1.9.0 and 1.2.0, respectively.
Can I count on backward compatibility, and just use okio version 1.9.0 (in place of 1.2.0)?
Yes, all 1.x versions of Okio are API-compatible. Check the CHANGELOG for details.
The ROS2 Humble distro currently supports C++17. Can it support C++20 modules?
Is there any prospect of a future ROS2 distro supporting, being compatible with, or leveraging C++20 features?
Is there any plan for a future ROS2 distro that compiles with C++20?
One that utilises C++20 features (especially C++20 modules)?
How far are we from supporting C++20 modules? Will "modules" ever be supported, or are they in conflict with how ROS2 is designed?
Today, you can build your own packages with C++20 and link them against ROS 2 packages built with C++17 just fine.
Be aware though: not all C++20 features are supported on all compilers. Of note: as of writing, C++20 Modules are only partially supported on GCC, Clang, and MSVC, so I would guess a full migration of the ROS core to C++20 modules is a long way off. See https://en.cppreference.com/w/cpp/compiler_support for a more detailed summary.
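As a rough sketch of what that looks like in practice (the node name, the example logic, and the target name in the CMake comment below are all hypothetical, not taken from any real ROS 2 package), a node compiled as C++20 can freely use features such as ranges while linking against the C++17-built rclcpp:

// cpp20_demo.cpp (hypothetical example)
// Enable C++20 for your own target only, e.g. in your package's CMakeLists.txt:
//   target_compile_features(cpp20_demo PUBLIC cxx_std_20)
#include <memory>
#include <ranges>
#include <vector>
#include <rclcpp/rclcpp.hpp>

int main(int argc, char ** argv)
{
  rclcpp::init(argc, argv);
  auto node = std::make_shared<rclcpp::Node>("cpp20_demo");

  // C++20 ranges in your application code; rclcpp itself was built as C++17,
  // and the two link together without issue.
  std::vector<int> ids{1, 2, 3, 4};
  for (int id : ids | std::views::filter([](int i) { return i % 2 == 0; })) {
    RCLCPP_INFO(node->get_logger(), "even id: %d", id);
  }

  rclcpp::shutdown();
  return 0;
}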
As for ROS2 design, plans, and direction, you should probably post on https://discourse.ros.org/
In order to use my custom TF model through WinML, I converted it to ONNX using the tf2onnx converter. The conversion finally worked using opset 11. Unfortunately I cannot load the model in the WinRT C++ library, so I am confused about the opset support: according to the release notes, the latest WinML release in May supports opset 11. I have updated my VS2019 and downloaded the latest Windows 10 SDK; does the C++ API already include the latest ONNX support? Or is there any alternative way to use my model in WinML C++?
The latest release of Windows OS contains support for opset 9.
The latest release of Microsoft.AI.MachineLearning NuGet package contains support for opset 11.
Please refer to these release notes: https://learn.microsoft.com/en-us/windows/ai/windows-ml/release-notes
You can find the latest Microsoft.AI.MachineLearning NuGet package here: https://www.nuget.org/packages/Microsoft.AI.MachineLearning/
As @Kookei mentioned, there are two ways of building WinML: the "In-Box" way and the NuGet way.
In-Box basically just means linking against whatever WinML DLLs are included with Windows itself (e.g., in C:\Windows\System32).
The NuGet package contains its own, more recent set of DLLs, which, besides providing support for the latest ONNX opset, has the obvious advantage of allowing you to easily distribute your binary to older versions of Windows lacking any built-in machine learning capability. Just install the package through Visual Studio's NuGet Package Manager and build your solution; you'll find that the output directory now contains the needed DLLs (currently directml.dll, Microsoft.AI.MachineLearning.dll, and onnxruntime.dll) along with the generated EXE, ready for same-folder deployment.
In terms of source code, this is how the two versions are distinguished:
In-Box:
#include <winrt/Windows.AI.MachineLearning.h>
using WinMLModel = winrt::Windows::AI::MachineLearning::LearningModel;
NuGet:
#include <winrt/Microsoft.AI.MachineLearning.h>
using WinMLModel = winrt::Microsoft::AI::MachineLearning::LearningModel;
In other words, the only difference is whether you use the Windows or Microsoft header/namespace.
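To make that concrete, here is a minimal sketch of loading and evaluating a model through the NuGet flavour; the model path, the tensor names "input"/"output", and the shape are placeholders for whatever your ONNX file actually declares, and swapping Microsoft for Windows in the header and namespace gives the In-Box equivalent:

// Hypothetical sketch against the NuGet (Microsoft.AI.MachineLearning) API.
#include <winrt/Windows.Foundation.Collections.h>
#include <winrt/Microsoft.AI.MachineLearning.h>
#include <vector>

namespace winml = winrt::Microsoft::AI::MachineLearning;

int main()
{
    winrt::init_apartment();

    // Load the converted ONNX model (opset 11 works with the NuGet DLLs).
    auto model = winml::LearningModel::LoadFromFilePath(L"model.onnx");
    winml::LearningModelDevice device(winml::LearningModelDeviceKind::Default);
    winml::LearningModelSession session(model, device);

    // Bind a dummy input tensor; "input", "output", and the shape are
    // placeholders for the names and dimensions your model declares.
    std::vector<float> data(1 * 3 * 224 * 224, 0.0f);
    auto tensor = winml::TensorFloat::CreateFromArray({1, 3, 224, 224}, data);
    winml::LearningModelBinding binding(session);
    binding.Bind(L"input", tensor);

    auto results = session.Evaluate(binding, L"run");
    auto output = results.Outputs().Lookup(L"output").as<winml::TensorFloat>();
    auto values = output.GetAsVectorView();  // IVectorView<float> of results
    return 0;
}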
You can also track the supported ONNX opset versions in the version matrix table.
I was reading the official Nuxt.js guide, but it says it only supports ES6/ES7.
Are the newer versions, ES8 or ES9, normally ignored, or are they just not supported yet?
Is there any reason that ES6 is used as the term for modern JavaScript rather than the latest version?
The newer ES standards don't need to be supported by these libraries. You can normally add support for later standards and experimental features with Babel.
I found a similar question about the difference between QAPlug-FindBugs and FindBugs-IDEA, but not about CheckStyle...
From previous experience with these plug-ins, I think one difference is that the CheckStyle version bundled in QAPlug seems to lag behind the one bundled in CheckStyle-IDEA (I'm not referring to the versions of the plug-ins themselves, but to the version of CheckStyle embedded in them).
Does anyone know of any other differences between them?
Advantages of CheckStyle-IDEA over QAPlug – Checkstyle:
Newer version of Checkstyle used
Community on GitHub
Fast scan before commit
Support for multiple configurations
Support for third-party checks
I haven't noticed any use case when QAPlug is better.
Backward compatibility means (as I understand it) that newer versions of the code will:
Still be able to read data written by older versions
Still be able to operate with older versions using some network protocol
Basically, nothing will break when I upgrade a system to a newer version, even if it is just one part of a larger system, or uses resources created by the older system
But what is "Forward Compatible"?
Forward compatible basically means that the code will be compatible with newer versions of the software. In other words, code you write now will transition smoothly to the next software version. You could also think of it as the next version(s) of the software being backward compatible with the current version.
https://en.wikipedia.org/wiki/Forward_compatibility
Backward compatibility vs forward compatibility
We have two versions of a piece of software: version A, and a newer version B. We say:
B is backward compatible with A if everything that works on A will still work on B.
A is forward compatible with B if everything that works on A will still work on B.
B is backward compatible if it is backward compatible with all the versions < B.
A is forward compatible if it is forward compatible with all the versions > A.
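A toy illustration (entirely made up, not tied to any particular product): version A of a program writes a key=value settings file containing a single "volume" key, and the newer version B adds a "theme" key. Because A's parser tolerates keys it does not recognise, A is forward compatible with files written by B; because B still understands "volume", B is backward compatible with files written by A:

// Hypothetical key=value settings format shared by versions A and B.
#include <iostream>
#include <map>
#include <sstream>
#include <string>

// The parser shipped with version A: unknown keys are parsed but never
// consulted, so a file written by the newer version B still loads cleanly.
std::map<std::string, std::string> parse(const std::string & text)
{
    std::map<std::string, std::string> settings;
    std::istringstream in(text);
    std::string line;
    while (std::getline(in, line)) {
        auto eq = line.find('=');
        if (eq == std::string::npos) continue;  // skip malformed lines
        settings[line.substr(0, eq)] = line.substr(eq + 1);
    }
    return settings;
}

int main()
{
    // A file written by the newer version B, which added "theme".
    const std::string written_by_b = "volume=7\ntheme=dark\n";

    // Version A reads B's file: it only looks up "volume" and simply
    // ignores "theme"; that tolerance is A's forward compatibility.
    auto settings = parse(written_by_b);
    std::cout << "volume = " << settings["volume"] << "\n";

    // B, in turn, still reads "volume" from files written by A, which is
    // B's backward compatibility with A.
    return 0;
}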