Planet KDE - http://planetKDE.org/

KDevelop 5.3.3 released

Wednesday 17th of July 2019 09:07:35 PM

Today we provide a stabilization and bugfix release with version 5.3.3. This is a bugfix-only release that introduces no new features; as such, it is a safe and recommended update for everyone currently using a previous version of KDevelop 5.3.

You can find a Linux AppImage as well as the source code archives on our Download page. Windows installers are currently not offered; we are looking for someone interested in taking care of that.

kdevelop
  • Use KDE_INSTALL_LOGGINGCATEGORIESDIR for kdebugsettings .categories files. (commit. )
  • TextDocument: remove actions from contextmenu on hide already. (commit. code review D22424)
  • Sublime: fix crash on undocking toolviews with Qt 5.13. (commit. fixes bug #409790)
  • Kdevplatform/interfaces: fix missing explicit QVector include. (commit. )
  • Fix kdevelopui.rc: bump version as required by string context changes. (commit. )
  • Shell: overwrite katectagsplugin to be disabled by default. (commit. )
  • Translate relative paths of input files to absolute ones. (commit. )
  • Welcome page: do not add currently unused qml pages to qrc data. (commit. )
  • Fix browse mode not disabled after Ctrl is released. (commit. )
  • Attempt to fix a crash on shutdown. (commit. )
  • ProblemHighlighter: Fix mark type handling. (commit. )
  • Cmakebuilddirchooser: Set a minimum size. (commit. )
  • Fix memory leaks reported by ASAN. (commit. )
  • Qmake: Move builder plugin to correct category. (commit. fixes bug #407396)
  • Add DesktopEntry to notifyrc. (commit. code review D20920)
  • Output config subpages alphabetically, instead of order in which corresponding plugins were loaded. (commit. code review D14391)
  • Flatpak plugin - fix typo ("flies" -> "files"). (commit. )
kdev-python
  • Use KDE_INSTALL_LOGGINGCATEGORIESDIR for kdebugsettings .categories file. (commit. )
kdev-php
  • Use KDE_INSTALL_LOGGINGCATEGORIESDIR for kdebugsettings .categories file. (commit. )
  • Update phpfunctions.php to phpdoc revision 347011. (commit. )
  • Parse more function aliases. (commit. )
  • More special handling for mcrypt constants. (commit. )
  • Parse more class constants. (commit. )
  • Parse more constants. (commit. )
  • Parse constants under para. (commit. )
  • Parse more constants within tables. (commit. )
  • Parse url stat constants. (commit. )
  • Parse php token constants. (commit. )
  • Parse file upload constants. (commit. )
  • Parse constants under tables. (commit. )
  • Parse constants under section and within tables. (commit. )
  • Parse nl-langinfo constants. (commit. )
  • Parse constants split over multiple lists within one file. (commit. )
  • Fix function declarations with varargs. (commit. )
kossebau | Wed, 2019/07/17 - 23:07 | Category: News | Tags: release, 5.3

Latte, Documentation and Reports...

Wednesday 17th of July 2019 03:25:51 PM


The first Latte beta release for v0.9.0 is getting ready and I am really happy about it :). But today, instead of talking about the beta release, I am going to focus on two last-minute "arrivals" for v0.9: Layouts Reports and Documentation. If you want to read the previous article first, you can do so at Latte and "Flexible" settings...



Layouts Reports

These reports target the most experienced Latte users, those who use different layouts and work frequently with the hidden but important Layouts Editor. In that case you may have noticed a message dialog that appears when opening the Layouts Editor while one or more layouts are broken.


- settings window in v0.9 -
To access these new Reports for Layouts you can use the settings dialog menu, more specifically File → Screens and Layout → Information.






The Screens Report is produced from the ~/.config/lattedockrc file and, as you can see in the relevant screenshot, provides screen ids and names, which screens are active, which one is the primary or secondary, and which screens have no docks/panels assigned in the system and can be safely removed.










On the other hand, the Layout Information Report provides dock/panel ids, which screen and edge they are assigned to, whether they are currently active, what the systray ids are, and whether there are any orphan systrays/containments that can be safely removed when this layout is not active. At the end of the dialog there is an errors section that reports on the layout's health. If the layout is broken or is considered suspicious for instability (e.g. when different applets use the same id even though they should not), the relevant errors and information appear.





Documentation
First I want to thank the great Chris Raven, who approached me through reddit (/u/Magentium) and helped me upload the technical documentation to techbase.kde.org! A very important job that took a big burden off my shoulders.

So if you are a developer interested in making your QML plasma applet work nicely with Latte, or you want to provide your own majestic Latte Indicator through the KDE store, you can now find all the needed information.


- Latte Dock at techbase.kde.org -
For those that have missed it in the past, all Latte user documentation has moved to the KDE UserBase website. I know that some of the information might be outdated, but nonetheless as a community we can all help.

- Latte Dock at userbase.kde.org -

Latte v0.9 Release Schedule

  • Tomorrow OR on Friday: first v0.9 beta release [tagged v0.8.97], followed by ten days of bug fixes, translation string fixes and improvements
  • End of July 2019: v0.9 will be released officially as the new Latte stable version



How Can I Help?

Bugs, bugs, bugs.... Translations, translation, translations...
  1. As you may have noticed, plenty of new settings were added in v0.9, and bugs can appear when combining options
  2. Even though the KDE localization teams are checking the translation strings almost daily (and I THANK THEM for this!!), we are humans and some translation strings can probably still be improved
  3. For complicated settings I use tooltips in order to describe them better. If you find an option that does not have a tooltip, OR whose tooltip text could be explained more fully or simplified, feel free to report it (I am not a native English speaker)



Donations

You can find Latte at Liberapay if you want to support it,

or you can split your donation between my active projects in kde store.

Krita 4.2.3 Released

Wednesday 17th of July 2019 08:48:55 AM

Today we’re releasing Krita 4.2.3. This is mostly a bug fix release, but has one new feature: it is now possible to rotate the canvas with a two-finger touch gesture. This feature was implemented by Sharaf Zaman for his 2019 Google Summer of Code work of porting Krita to Android. The feature also works on other platforms, of course.

The most important bug fix is a workaround for Windows installations with broken, outdated or insufficient graphics drivers. The core of the issue is that our development platform, Qt, in its current version needs a working OpenGL or Direct3D installation as soon as there is a single component in the application that uses QML, a technology for creating user interfaces. We have managed to work around this issue and especially users of Windows 7 systems that have become a bit messy should be able to run Krita again.

Bugs Fixed
  • Make it possible for Krita to use a software renderer on Windows (BUG:408872)
  • Fix the caption of the Background Color color selection dialog (BUG:407658)
  • Fix the tag selector combobox so it is possible to select resources that have a tag that’s longer than fits in the combobox (BUG:408053)
  • Make it possible for the Krita startup window to become as narrow as before adding the news widget (BUG:408504)
  • Fix copy/pasting of animation frames (BUG:408421, BUG:404595)
  • Make the polygon and outline selection tool handle the control modifier correctly (BUG:376007)
  • Add a reload script button to the Python scripter plugin
  • Fix a crash in the Overview docker when there is no image open
  • Fix drag and drop of fill layers between opened files (BUG:408019)
  • Fix loading EXR files that have more than one layer with the same name (BUG:409552)
  • Hide vanishing points preview lines when assistants are hidden (BUG:396158)
  • Fix the Mirror All Layers Horizontally function
  • Fix switching profile to default when changing channel depth in the New Image dialog (BUG:406700)
  • Disable AVX optimizations for some 32 bit blending modes (BUG:404133)
  • Fix a crash when pressing cancel when trying to create an 8 bit/channel linear gamma RGB image
  • Fix colors drifting when using the native macOS color selection dialog (BUG:407880)
  • Make sure Stroke Selection paint correctly with the selection border in the middle of the selection (BUG:409254)
  • Fix saving Krita when perspective assistants are present (BUG:409249)
  • Fix issues with transformations being pixelated (BUG:409280)
  • Make it possible to hide all layers except the selected one with shift-click (BUG:376086)
  • Fix cloning keyframe channels that are not opacity channels
  • Fix a hang when trying to paint while playing an animation of an empty image (BUG:408749)
  • Fix Isolated Mode when multiple windows are open (BUG:408150)
  • Make the gradient editor show the right editor for stop and segmented gradients (but creating new gradients in Krita is still broken)
  • Make Krita use the integrated GPU on dual-GPU Apple computers
  • Fix a freeze when pressing delete when making a polygonal selection (BUG:408843)
  • Fix the --export commandline option to return 0 when the export is successful (BUG:409133)
  • Fix support for the KDE Plasma global menu (BUG:408015)
  • Fix a crash when using the shrink option of the deform brush (BUG:408887)
Download

Windows

Note for Windows users: if you encounter crashes, please follow these instructions to use the debug symbols so we can figure out where Krita crashes.

Linux

(If, for some reason, Firefox thinks it needs to load this as text: to download, right-click on the link.)

OSX

Note: the gmic-qt is not available on OSX.

Source code

md5sum

For all downloads:

Key

The Linux appimage and the source .tar.gz tarball are signed. You can retrieve the public key over https here:  0x58b9596c722ea3bd.asc. The signatures are here (filenames ending in .sig).

Support Krita

Krita is a free and open source project. Please consider supporting the project with donations or by buying training videos or the artbook! With your support, we can keep the core team working on Krita full-time.

Kaidan 0.4.1 released!

Tuesday 16th of July 2019 01:00:00 PM

After some problems were encountered in Kaidan 0.4.0, we tried to fix the most urgent bugs.

Changelog
  • Fix SSL problems for AppImage (lnj)
  • Fix connection problems (lnj)
  • Keep QXmpp v0.8.3 compatibility (lnj)
Download

Plasma sprint, 2019 edition; personal updates

Tuesday 16th of July 2019 01:49:52 AM
KDE Project:

In June, I had a great time at a series of KDE events held in the offices of Slimbook, makers of fantastic Neon-powered laptops, on the outskirts of Valencia, Spain. Following on from a two-day KDE e.V. board of directors meeting, the main event was the 2019 edition of the Plasma development sprint. The location proved to be quite ideal for everything. Slimbook graciously provided us with two lovely adjacent meeting rooms for Plasma and the co-located KDE Usability & Productivity sprint, allowing the groups to mix and separate as our topics demanded - a well-conceived spatial analog for the tight relationship and overlap between the two.


The Plasma team walked the gorgeous Jardí del Túria almost every day during their sprint week to stay healthy, happy devs.

As always during a Plasma sprint, we used this opportunity to lock down a number of important development decisions. Release schedules, coordinating the next push on Plasma/Wayland and a new stab at improving the desktop configuration experience stand out to me, but as the Dot post does a fine job providing the general rundown, I'll focus on decisions made for the Task Manager widgets I maintain.

On one of the sprint mornings, I led a little group session to discuss some of the outstanding high-level problems with the two widgets (the regular Task Manager and the Icons-only Task Manager), driven by frequent user reports:

  • Poor experience performing window management on groups of windows
  • Unnecessary duplication in the UI displaying window group contents
  • Unintuitive behavior differences between the two widgets

To address these, we came up with a list of action items to iteratively improve the situation. Individually they're quite minor, but there are many of them, and they will add up to smooth out the user experience considerably. In particular, we'll combine the currently two UIs showing window group contents (the tooltip and the popup dialog) into just one, and we'll make a new code path to cycle through windows in a group in most recently used order on left click the new default. The sprint notes have more details.

Decision-making aside, a personal highlight for me was a live demo of Marco Martin's new desktop widget management implementation. Not only does it look like a joy to use, it also improves the software architecture of Plasma's home screen management in a way that will help Plasma Mobile and other use cases equally. Check out his blog post for more.


I got a new laptop. Slimbook founder Alejandro López made it a proper computer by attaching a particularly swanky metal KDE sticker during the preceding KDE e.V. board sprint.

In KDE e.V. news, we briefly stole one of the sprint rooms for a convenient gathering of most of our Financial Working Group, reviewing the implementation of the annual budget plan of the organization. We also had a chance to work with the Usability goal crew (have you heard about KDE goals yet?) on a plan for the use of their remaining budget -- it's going to be exciting.

As a closing note, it was fantastic to see many new faces at this year's sprint. It's hard to believe for how many attendees it was their first KDE sprint ever, as it couldn't have been more comfortable to have them on board. It's great to see our team grow.

See you next sprint. :)

In more personal news, after just over seven years at the company I'm leaving Blue Systems GmbH at the end of July. It's been a truly fantastic time working every day with some of the finest human beings and hackers. The team there will go on to do great things for KDE and personal computing as a whole, and I'm glad we will keep contributing together to Plasma and other projects we share interests and individual responsibilities in.

As a result, the next ~10 weeks will see me very busy moving continents from Seoul back to my original home town of Berlin, where I'll be starting on a new adventure in October. More on that later (it's quite exciting), but my work on the KDE e.V. board of directors or general presence in the KDE community won't be affected.

That said -- between the physical and career moves, board work and personal preparations for Akademy, I'll probably need to be somewhat less involved and harder to reach in the various project trenches during this quarter. Sorry for that, and do poke hard if you need me to pick up something I've missed.

KDE Applications 19.08 branches created

Monday 15th of July 2019 07:22:21 PM

Make sure you commit anything you want to end up in the KDE Applications 19.08 release to them.

We're already past the dependency freeze.

The freeze and beta is this Thursday, the 18th of July.

More interesting dates
August 1, 2019: KDE Applications 19.08 RC (19.07.90) Tagging and Release
August 8, 2019: KDE Applications 19.08 Tagging
August 15, 2019: KDE Applications 19.08 Release

https://community.kde.org/Schedules/Applications/19.08_Release_Schedule

Kate LSP Client Continued

Sunday 14th of July 2019 01:33:00 PM

The new LSP client by Mark Nauwelaerts made nice progress since the LSP client restart post last week.

Reminder: The plugin is not compiled by default; you can turn it on via:

cmake -DCMAKE_INSTALL_PREFIX="your prefix" -DENABLE_LSPCLIENT=ON "kate src dir"

The code can still be found in kate.git master; see lspclient in the addons directory.

What is new?

  • Diagnostics support: A tab in the LSP client toolview will show the diagnostics, grouped by file, with links to jump to the locations. Issues will be highlighted in the editor view, too.

  • Find references: Find all references to some variable/function in your complete program. They are listed, like the diagnostics, grouped per file in an extra tab.

  • Improved document highlight: Highlight all occurrences of a variable/… inside the current document. Besides highlighting the reads/writes/uses, you get a jump list in a tab, too, just like for the other features.

A feature I forgot to show last time:

  • Hover support: Show more meta info about a code location, like the proper type, useful e.g. for almost-always-auto C++ programming.

We even already got two patches for the fresh plugin:

Both aim to improve support for the Rust LSP server. As you can see, they already got reviewed and merged.

Feel welcome to show up on kwrite-devel@kde.org and help out! All development discussions regarding this plugin happen there.

If you are already familiar with Phabricator, post some patch directly at KDE’s Phabricator instance.

You want more LSP servers supported? You want to have feature X? You have seen some bug and want it to vanish? => Join!

KDE Usability & Productivity: Week 79

Sunday 14th of July 2019 04:01:27 AM

After a somewhat light week, we're back with week 79 in KDE's Usability & Productivity initiative, and there's a ton of cool stuff for you!

New Features

Bugfixes & Performance Improvements

User Interface Improvements

Next week, your name could be in this list! Not sure how? Just ask! I’ve helped mentor a number of new contributors recently and I’d love to help you, too! You can also check out https://community.kde.org/Get_Involved, and find out how you can help be a part of something that really matters. You don’t have to already be a programmer. I wasn’t when I got started. Try it, you’ll like it! We don’t bite!

If you find KDE software useful, consider making a tax-deductible donation to the KDE e.V. foundation.

KDE Craft Packager on macOS

Saturday 13th of July 2019 08:43:27 AM

In Craft, to create a package, we can use craft --package <blueprint-name> after compiling and installing a library or an application with the given blueprint name.

On macOS, MacDMGPackager is the packager used by Craft. MacDylibBundler is used in MacDMGPackager to handle the dependencies.

In this article, I'll give a brief introduction to the two classes and the improvements I've made for my GSoC project.

MacDMGPackager

MacDMGPackager is a subclass of CollectionPackagerBase. Its most important method is createPackage.

First of all,

self.internalCreatePackage(seperateSymbolFiles=packageSymbols)
Initialisation of directory variables

Here we get the defines, the path of the application which we want to pack, and the path of the archive.
appPath should be the root of an application package with the .app extension. Following the macOS application bundle conventions, targetLibdir points to the library directory of the application.
During compilation and installation, the application directory only contains a .plist file and a MacOS subdirectory, so the library directory is created next for further use.

defines = self.setDefaults(self.defines)
appPath = self.getMacAppPath(defines)
archive = os.path.normpath(self.archiveDir())
# ...
targetLibdir = os.path.join(appPath, "Contents", "Frameworks")
utils.createDir(targetLibdir)

Moving files to correct directories

Then, we predefine a list of pairs of source and destination directories and move the files to the destinations. The destinations are the correct directories for libraries, plugins and resources in a macOS application package.

moveTargets = [
    (os.path.join(archive, "lib", "plugins"), os.path.join(appPath, "Contents", "PlugIns")),
    (os.path.join(archive, "plugins"), os.path.join(appPath, "Contents", "PlugIns")),
    (os.path.join(archive, "lib"), targetLibdir),
    (os.path.join(archive, "share"), os.path.join(appPath, "Contents", "Resources"))]

if not appPath.startswith(archive):
    moveTargets += [(os.path.join(archive, "bin"), os.path.join(appPath, "Contents", "MacOS"))]

for src, dest in moveTargets:
    if os.path.exists(src):
        if not utils.mergeTree(src, dest):
            return False

Fixing dependencies using MacDylibBundler

After the moving, we create an instance of MacDylibBundler with appPath. Inside the with block, all the code is executed with the DYLD_FALLBACK_LIBRARY_PATH=<package.app>/Contents/Frameworks:<Craft-Root>/lib environment variable set.

For further reading about this environment variable, please refer to this question on Stack Overflow.

dylibbundler = MacDylibBundler(appPath)
with utils.ScopedEnv({'DYLD_FALLBACK_LIBRARY_PATH': targetLibdir + ":" + os.path.join(CraftStandardDirs.craftRoot(), "lib")}):
    # ...

Fixing dependencies of main binary

Here, we first create a Path object. It points to the executable of the macOS package.

It should be noted that, although we use the same name for both the macOS application package and the executable here, this is not mandatory: the name of the executable is defined by CFBundleExecutable in the .plist file, so reading it from the .plist file might be a better solution. A minimal sketch of that idea follows.
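
As an illustration only (this is not Craft's code): the bundle's Info.plist can be read with Python's standard plistlib module. The helper name below and the assumption that appPath is the .app root are mine.

import os
import plistlib

def readBundleExecutableName(appPath: str) -> str:
    # Hypothetical helper, not part of Craft: read CFBundleExecutable from the
    # bundle's Info.plist; appPath is assumed to be the .app root directory.
    with open(os.path.join(appPath, "Contents", "Info.plist"), "rb") as f:
        info = plistlib.load(f)
    return info["CFBundleExecutable"]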

Then, the method bundleLibraryDependencies is used to copy libraries and fix dependencies for the executable in the package.

A brief introduction to this method:

  1. Call utils.getLibraryDeps to get a list of dependencies. This is done using otool -L (an illustrative sketch of this step follows the code below).
  2. Copy missing dependencies into Contents/Frameworks, and update the library references in the executable.
    I'll give a more detailed analysis in the next section.
CraftCore.log.info("Bundling main binary dependencies...")
mainBinary = Path(appPath, "Contents", "MacOS", defines['appname'])
if not dylibbundler.bundleLibraryDependencies(mainBinary):
    return False
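
As promised above, here is a minimal, stand-alone sketch of the otool -L step. It is not Craft's actual utils.getLibraryDeps implementation, just a hypothetical equivalent showing how the dependency list can be obtained.

import subprocess

def listLibraryDeps(binaryPath: str) -> list:
    # Hypothetical sketch of the otool -L step (Craft's real helper is utils.getLibraryDeps).
    output = subprocess.check_output(["otool", "-L", binaryPath]).decode("utf-8")
    deps = []
    for line in output.splitlines()[1:]:  # the first line is just "<binary>:"
        line = line.strip()
        if line:
            # each entry looks like "<path> (compatibility version ..., current version ...)"
            deps.append(line.split(" (compatibility")[0].strip())
    return deps
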
Fixing dependencies in Frameworks and PlugIns

And then, we try to fix all the dependencies of libraries in Contents/Frameworks and Contents/PlugIns.

# Fix up the library dependencies of files in Contents/Frameworks/
CraftCore.log.info("Bundling library dependencies...")
if not dylibbundler.fixupAndBundleLibsRecursively("Contents/Frameworks"):
    return False
CraftCore.log.info("Bundling plugin dependencies...")
if not dylibbundler.fixupAndBundleLibsRecursively("Contents/PlugIns"):
    return False

Fixing dependencies using macdeployqt

macdeployqt is used to fix the Qt libraries used by the application. Craft installed it while compiling and installing Qt. But don't worry, it is not in your system path.

I have not yet worked out exactly what macdeployqt does; it's nice to have a look at its source code.

if not utils.system(["macdeployqt", appPath, "-always-overwrite", "-verbose=1"]):
    return False

Removing files in blacklist

If macdeployqt added some files which we don't want, they are removed here.

# macdeployqt might just have added some explicitly blacklisted files
blackList = Path(self.packageDir(), "mac_blacklist.txt")
if blackList.exists():
    pattern = [self.read_blacklist(str(blackList))]
    # use it as whitelist as we want only matches, ignore all others
    matches = utils.filterDirectoryContent(appPath, whitelist=lambda x, root: utils.regexFileFilter(x, root, pattern), blacklist=lambda x, root: True)
    for f in matches:
        CraftCore.log.info(f"Remove blacklisted file: {f}")
        utils.deleteFile(f)

Fixing dependencies after fixing of macdeployqt

After macdeployqt, there may be some libraries or plugins added by macdeployqt, so we fix the dependencies once again.

But I doubt whether we need to fix the dependencies twice. I'll update this post after I figure out what it would lead to if we just fixed them after macdeployqt.

# macdeployqt adds some more plugins so we fix the plugins after calling macdeployqt
dylibbundler.checkedLibs = set()  # ensure we check all libs again (but we should not need to make any changes)
CraftCore.log.info("Fixing plugin dependencies after macdeployqt...")
if not dylibbundler.fixupAndBundleLibsRecursively("Contents/PlugIns"):
    return False
CraftCore.log.info("Fixing library dependencies after macdeployqt...")
if not dylibbundler.fixupAndBundleLibsRecursively("Contents/Frameworks"):
    return False

Checking dependencies

Then, we use MacDylibBundler to check all dependencies in the application package. If there is any bad dependency, the packaging process fails.

# Finally sanity check that we don't depend on absolute paths from the builder
CraftCore.log.info("Checking for absolute library paths in package...")
found_bad_dylib = False  # Don't exit immeditately so that we log all the bad libraries before failing:
if not dylibbundler.areLibraryDepsOkay(mainBinary):
    found_bad_dylib = True
    CraftCore.log.error("Found bad library dependency in main binary %s", mainBinary)
if not dylibbundler.checkLibraryDepsRecursively("Contents/Frameworks"):
    CraftCore.log.error("Found bad library dependency in bundled libraries")
    found_bad_dylib = True
if not dylibbundler.checkLibraryDepsRecursively("Contents/PlugIns"):
    CraftCore.log.error("Found bad library dependency in bundled plugins")
    found_bad_dylib = True
if found_bad_dylib:
    CraftCore.log.error("Cannot not create .dmg since the .app contains a bad library depenency!")
    return False

Creating DMG image

If everything went well up to this point, we can create a DMG image for the application.

name = self.binaryArchiveName(fileType="", includeRevision=True)
dmgDest = os.path.join(self.packageDestinationDir(), f"{name}.dmg")
if os.path.exists(dmgDest):
    utils.deleteFile(dmgDest)
appName = defines['appname'] + ".app"
if not utils.system(["create-dmg", "--volname", name,
                     # Add a drop link to /Applications:
                     "--icon", appName, "140", "150", "--app-drop-link", "350", "150",
                     dmgDest, appPath]):
    return False

CraftHash.createDigestFiles(dmgDest)

return True

An example DMG image looks like this one; users can drag the application into the Applications directory to install it.

MacDylibBundler

Constructor

def __init__(self, appPath: str):
    # Avoid processing the same file more than once
    self.checkedLibs = set()
    self.appPath = appPath

In the constructor, a set is created to store the libraries which have already been checked, and the appPath passed by the caller is stored.

Methods

The methods bundleLibraryDependencies and _addLibToAppImage are the most important methods in this class, but they're too long to show in full, so I'll only give a brief introduction to each.

_addLibToAppImage checks whether a library is already in Contents/Frameworks. If the library doesn't exist there, it copies it into the directory and tries to fix it up with a relative path.

def _addLibToAppImage(self, libPath: Path) -> bool:
    # ...
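
Based on that description, a heavily simplified sketch of the behaviour might look like the following. This is illustrative only and not the actual Craft implementation, which handles many more edge cases (permissions, missing files, frameworks, ...).

def _addLibToAppImage_sketch(self, libPath: Path) -> bool:
    # Illustrative only: copy the dylib into Contents/Frameworks if it is not
    # there yet, fix its id, and then process the copy's own dependencies.
    targetPath = Path(self.appPath, "Contents/Frameworks/", libPath.name)
    if targetPath.exists() and targetPath in self.checkedLibs:
        return True  # already bundled and processed
    if not targetPath.exists():
        utils.copyFile(str(libPath), str(targetPath), linkOnly=False)
    self._fixupLibraryId(targetPath)  # rewrite an absolute library id
    self.checkedLibs.add(targetPath)
    return self.bundleLibraryDependencies(targetPath)  # the copied lib has dependencies, too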

bundleLibraryDependencies checks the dependencies of fileToFix. If there are dependencies with absolute paths, it copies them into Contents/Frameworks by calling _addLibToAppImage, and then calls _updateLibraryReference to update the library references.

def bundleLibraryDependencies(self, fileToFix: Path) -> bool:
    # ...
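
And a similarly simplified, illustrative sketch of bundleLibraryDependencies, tying the previous sketches together (again, not the real Craft code):

def bundleLibraryDependencies_sketch(self, fileToFix: Path) -> bool:
    # Illustrative only; the real method also fixes the library id of fileToFix
    # itself and handles @rpath entries and other special cases.
    for dep in utils.getLibraryDeps(str(fileToFix)):              # otool -L
        if os.path.isabs(dep):
            if not self._addLibToAppImage(Path(dep)):             # copy into Contents/Frameworks
                return False
            if not self._updateLibraryReference(fileToFix, dep):  # install_name_tool -change
                return False
    return True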

As described in the docstring, fixupAndBundleLibsRecursively removes absolute references and bundles all dependencies for all dylibs.

It traverses the directory and, for each file which is not a symbolic link, checks whether it ends with ".so" or ".dylib", whether ".so." is in the file name, or whether ".framework" is in the full path and the file is a macOS binary. If that's the case, the bundleLibraryDependencies method is called to bundle it into the .app package.

def fixupAndBundleLibsRecursively(self, subdir: str):
    """Remove absolute references and budle all depedencies for all dylibs under :p subdir"""
    # ...
    for dirpath, dirs, files in os.walk(os.path.join(self.appPath, subdir)):
        for filename in files:
            fullpath = Path(dirpath, filename)
            if fullpath.is_symlink():
                continue  # No need to update symlinks since we will process the target eventually.
            if (filename.endswith(".so")
                    or filename.endswith(".dylib")
                    or ".so." in filename
                    or (f"{fullpath.name}.framework" in str(fullpath) and utils.isBinary(str(fullpath)))):
                if not self.bundleLibraryDependencies(fullpath):
                    CraftCore.log.info("Failed to bundle dependencies for '%s'", os.path.join(dirpath, filename))
                    return False
    # ...

areLibraryDepsOkay checks all the dependencies. If a library is not referenced via @rpath, @executable_path or a system library path, the dependency cannot be satisfied on every Mac. It may happen to work in a particular environment, but it is a big problem.

def areLibraryDepsOkay(self, fullPath: Path):
    # ...
    for dep in utils.getLibraryDeps(str(fullPath)):
        if dep == libraryId and not os.path.isabs(libraryId):
            continue  # non-absolute library id is fine
        # @rpath and @executable_path is fine
        if dep.startswith("@rpath") or dep.startswith("@executable_path"):
            continue
        # Also allow /System/Library/Frameworks/ and /usr/lib:
        if dep.startswith("/usr/lib/") or dep.startswith("/System/Library/Frameworks/"):
            continue
        if dep.startswith(CraftStandardDirs.craftRoot()):
            CraftCore.log.error("ERROR: %s references absolute library path from craftroot: %s", relativePath,
                                dep)
        elif dep.startswith("/"):
            CraftCore.log.error("ERROR: %s references absolute library path: %s", relativePath, dep)
        else:
            CraftCore.log.error("ERROR: %s has bad dependency: %s", relativePath, dep)
        found_bad_lib = True

Here, in checkLibraryDepsRecursively, we traverse the directory to check all the dependencies of the libraries, i.e. the files ending in .dylib or .so.

def checkLibraryDepsRecursively(self, subdir: str):
    # ...
    for dirpath, dirs, files in os.walk(os.path.join(self.appPath, subdir)):
        for filename in files:
            fullpath = Path(dirpath, filename)
            if fullpath.is_symlink() and not fullpath.exists():
                CraftCore.log.error("Found broken symlink '%s' (%s)", fullpath,
                                    os.readlink(str(fullpath)))
                foundError = True
                continue

            if filename.endswith(".so") or filename.endswith(".dylib") or ".so." in filename:
                if not self.areLibraryDepsOkay(fullpath):
                    CraftCore.log.error("Found library dependency error in '%s'", fullpath)
                    foundError = True
    # ...

Static methods in class

The _updateLibraryReference method uses the install_name_tool -change command to change a reference to a dynamic library in a macOS/BSD binary.

@staticmethod
def _updateLibraryReference(fileToFix: Path, oldRef: str, newRef: str = None) -> bool:
    if newRef is None:
        basename = os.path.basename(oldRef)
        newRef = "@executable_path/../Frameworks/" + basename
    with utils.makeWritable(fileToFix):
        if not utils.system(["install_name_tool", "-change", oldRef, newRef, str(fileToFix)], logCommand=False):
            CraftCore.log.error("%s: failed to update library dependency path from '%s' to '%s'",
                                fileToFix, oldRef, newRef)
            return False
    return True

The _getLibraryNameId method can use otool -D to get the identity of a dynamic library in a macOS/BSD binary.

@staticmethod
def _getLibraryNameId(fileToFix: Path) -> str:
    libraryIdOutput = io.StringIO(
        subprocess.check_output(["otool", "-D", str(fileToFix)]).decode("utf-8").strip())
    lines = libraryIdOutput.readlines()
    if len(lines) == 1:
        return ""
    # Should have exactly one line with the id now
    assert len(lines) == 2, lines
    return lines[1].strip()

The _fixupLibraryId method can use install_name_tool -id to try to fix the absolute identity of a dynamic library in a macOS/BSD binary.

@classmethod
def _fixupLibraryId(cls, fileToFix: Path):
    libraryId = cls._getLibraryNameId(fileToFix)
    if libraryId and os.path.isabs(libraryId):
        CraftCore.log.debug("Fixing library id name for %s", libraryId)
        with utils.makeWritable(fileToFix):
            if not utils.system(["install_name_tool", "-id", os.path.basename(libraryId), str(fileToFix)],
                                logCommand=False):
                CraftCore.log.error("%s: failed to fix absolute library id name for", fileToFix)
                return False
    # ...

Conclusion

This class is a magic class which can achieve almost everything on macOS.

But the code style is a little confusing, and the parameter types are not consistent: some methods use str to represent a path, others use Path.

Maybe this can also be improved in the future.

Anyway, it’s really a helpful class.

Improvement

During my bonding period, I found that a library named qca-qt5 was not fixed appropriately, which caused a crash.

Locating the problem

After analyzing the crash log, I found that the library qca-qt5 was loaded twice. Two libraries with the same dynamic library id caused this crash.

qca-qt5 (0) <14AD33D7-196F-32BB-91B6-598FA39EEF20> /Volumes/*/kdeconnect-indicator.app/Contents/Frameworks/qca-qt5
(??? - ???) <14AD33D7-196F-32BB-91B6-598FA39EEF20> /Users/USER/*/qca-qt5.framework/Versions/2.2.0/qca-qt5

One is in the .app package, the other is in CraftRoot/lib.

As far as I know, qca-qt5 tries to search for its plugins in certain paths. The copy in the package was not fixed, so it started searching for plugins in the CraftRoot/lib directory. The plugins there refer to the qca-qt5 in that directory. So the two libraries with the same name were both loaded, and the application crashed.

Cause

With a good understanding of MacDylibBundler, we can improve it to fix the bug. This will also be helpful for other applications and libraries in Craft.

I noticed that all the libraries shipped as plain .dylib files are handled correctly. The problem lies with the libraries inside .framework packages: it seems that Craft cannot handle the dynamic libraries in a .framework correctly.

And we can see that, in checkLibraryDepsRecursively, only .so and .dylib files are checked. So this is a deeply hidden bug.

CRAFT: ➜ MacOS otool -L kdeconnectd
kdeconnectd:
    /Volumes/Storage/Inoki/CraftRoot/lib/libkdeconnectcore.1.dylib (compatibility version 1.0.0, current version 1.3.3)
    /Volumes/Storage/Inoki/CraftRoot/lib/libKF5KIOWidgets.5.dylib (compatibility version 5.0.0, current version 5.57.0)
    /Volumes/Storage/Inoki/CraftRoot/lib/libKF5Notifications.5.dylib (compatibility version 5.0.0, current version 5.57.0)
    /Volumes/Storage/Inoki/CraftRoot/lib/qca-qt5.framework/Versions/2.2.0/qca-qt5 (compatibility version 2.0.0, current version 2.2.0)
    ...

In the _addLibToAppImage method, the library in the framework is copied directly to the Contents/Frameworks. For example, lib/qca-qt5.framework/Versions/2.2.0/qca-qt5 becomes Contents/Frameworks/qca-qt5.

Then, during the fix in the fixupAndBundleLibsRecursively method, it will not be fixed, according to the following code: although it originally lived in a .framework directory and is a binary, after _addLibToAppImage it is no longer inside a .framework directory, so the check below does not match and it is not fixed.
if (filename.endswith(".so")
        or filename.endswith(".dylib")
        or ".so." in filename
        or (f"{fullpath.name}.framework" in str(fullpath) and utils.isBinary(str(fullpath)))):
    if not self.bundleLibraryDependencies(fullpath):
        CraftCore.log.info("Failed to bundle dependencies for '%s'", os.path.join(dirpath, filename))
        return False

Fixing it!

To fix it, I think a good idea is to copy the whole .framework directory, keeping its structure.

First, I add a check in the _addLibToAppImage method. For example, if qca-qt5 is in the qca-qt5.framework subdirectory, we change libBasename to qca-qt5.framework/Versions/2.2.0/qca-qt5, so the targetPath can also be computed correctly.

libBasename = libPath.name

# Handle dylib in framework
if f"{libPath.name}.framework" in str(libPath):
    libBasename = str(libPath)[str(libPath).find(f"{libPath.name}.framework"):]

targetPath = Path(self.appPath, "Contents/Frameworks/", libBasename)
if targetPath.exists() and targetPath in self.checkedLibs:
    return True

After several checks, the important part is copying the library. I add some code to check whether the library is in a .framework directory; if it is, I copy the entire directory to Contents/Frameworks. So for qca-qt5, the result should be Contents/Frameworks/qca-qt5.framework/Versions/2.2.0/qca-qt5.

if not targetPath.exists():
    if f"{libPath.name}.framework" in str(libPath):
        # Copy the framework of dylib
        frameworkPath = str(libPath)[:(str(libPath).find(".framework") + len(".framework"))]
        frameworkTargetPath = str(targetPath)[:(str(targetPath).find(".framework") + len(".framework"))]
        utils.copyDir(frameworkPath, frameworkTargetPath, linkOnly=False)
        CraftCore.log.info("Added library dependency '%s' to bundle -> %s", frameworkPath, frameworkTargetPath)
    else:
        utils.copyFile(str(libPath), str(targetPath), linkOnly=False)
        CraftCore.log.info("Added library dependency '%s' to bundle -> %s", libPath, targetPath)

After copying, another important point is in _updateLibraryReference. If a library is in a .framework directory, the new reference should be @executable_path/../Frameworks/*.framework/....

if newRef is None:
    basename = os.path.basename(oldRef)
    if f"{basename}.framework" in oldRef:
        # Update dylib in framework
        newRef = "@executable_path/../Frameworks/" + oldRef[oldRef.find(f"{basename}.framework"):]
    else:
        newRef = "@executable_path/../Frameworks/" + basename

After the fix, the executable can be launched without crashing.

CRAFT: ➜ MacOS otool -L kdeconnectd
kdeconnectd:
    @executable_path/../Frameworks/libkdeconnectcore.1.dylib (compatibility version 1.0.0, current version 1.3.3)
    @executable_path/../Frameworks/libKF5KIOWidgets.5.dylib (compatibility version 5.0.0, current version 5.57.0)
    @executable_path/../Frameworks/libKF5Notifications.5.dylib (compatibility version 5.0.0, current version 5.57.0)
    @executable_path/../Frameworks/qca-qt5.framework/Versions/2.2.0/qca-qt5 (compatibility version 2.0.0, current version 2.2.0)
    ...
CRAFT: ➜ MacOS ./kdeconnectd
kdeconnect.core: KdeConnect daemon starting
kdeconnect.core: onStart
kdeconnect.core: KdeConnect daemon started
kdeconnect.core: Broadcasting identity packet

Conclusion

In software development, there are always cases which we fail to consider. Open Source gives us the possibility of pooling intelligence from people all over the world to handle such cases.

That’s also why I like Open Source so much.

Today is the first day of the coding period; I hope all goes well for the community and all GSoC students :)

KDE Itinerary - Vector Graphic Barcodes

Saturday 13th of July 2019 07:45:00 AM

I have previously written about why we are interested in barcodes for the KItinerary extractor. This time it’s more about the how, specifically how we find and decode vector graphic barcodes in PDF files, something KItinerary wasn’t able to do until very recently.

Raster Graphics

While PDF is a vector graphics format, most barcodes we encounter in there are actually stored as images. Technically this might not be the cleanest or most efficient way, but it makes KItinerary’s life very easy: We just iterate over all images found in the PDF, and feed them into the barcode decoder.

It’s of course a bit more complicated to make this as efficient as possible, but conceptually you could script this with Poppler’s pdfimages command line tool and ZXing with just a few lines of code.

Vector Graphics

There are also providers that use vector graphics to represent barcodes in their PDF documents, for example Iberia, easyJet, Ryanair and Aer Lingus, enough to make this a relevant problem for KItinerary. The basic idea would be to render the relevant area of the document into an image and feed that into the barcode decoder. The rendering part is straightforward since Poppler has API for that, but how do we know where to look for a vector graphics barcode?

Answering that required a bit of digging into the PDF files, to understand how the barcodes are actually represented. Lacking a “GammaRay for PDF”, Inkscape turned out to be of great help. Importing PDF files there gives you both a graphical and a “textual” (via the generated SVG) representation of the PDF content. This showed three different variants:

  1. A single complex filled path for the entire barcode.
  2. A set of small filled paths (typically quads), for each line or dot of the barcode.
  3. A set of interrupted line strokes with a sufficiently wide pen, to draw the barcode as "scanlines".

Case (1) is the easiest one: path fill operations with a solid black brush and hundreds or more path elements within a bounding box of just a few centimeters are very rare for anything else, even more so when filtering out paths with curve elements.

The other two cases are much harder to detect without properly grouping all the involved drawing operations though. Here again Inkscape helped, as in all cases the barcodes were represented as an SVG group there, and Inkscape’s PDF import code contained the necessary hints on how to replicate that grouping in KItinerary.

So in the end we iterate over groups of path fill and line stroke operations found in the document, check them for being plausible barcodes by looking at brush or pen properties, path complexity, output size, etc., and then render them to a raster image (a simplified sketch of this filter follows). The last two steps are expensive, so it's important we discard as many false positives as possible before we get there.
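
To make the heuristic a bit more concrete, here is a simplified, illustrative filter in Python form. The thresholds and parameter names are made up for illustration; the real checks live in KItinerary's C++ code and operate on Poppler's drawing-operation stream.

# Illustrative only: made-up thresholds roughly matching the description above.
MIN_PATH_ELEMENTS = 200   # "hundreds or more" fill/stroke elements
MAX_EXTENT_MM = 50.0      # a bounding box of just a few centimeters

def plausible_barcode(path_element_count, has_curves, is_solid_black,
                      bbox_width_mm, bbox_height_mm):
    if has_curves or not is_solid_black:
        return False
    if path_element_count < MIN_PATH_ELEMENTS:
        return False
    return bbox_width_mm <= MAX_EXTENT_MM and bbox_height_mm <= MAX_EXTENT_MM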

As a result all remaining PDF documents with previously undetected barcodes in my sample collection now work, with minimal extra runtime cost.

Poppler’s Private API

While I’m quite happy with the result, it unfortunately comes at a cost, in form of a much stronger dependency on Poppler’s private API. KItinerary is already using Poppler’s private API for iterating over the images in a document, which makes distributors understandably very unhappy. For this dependency we had a plan on how to address it by adding the necessary features to Poppler’s public API (at the cost of processing the same document twice, once for text and once for images).

The new code however heavily relies on access to the low-level stream of drawing operations, which is a much much larger API surface to expose from Poppler than just iterating over image assets. Seeing that Inkscape has the same problem, maybe that is actually necessary though?

Contribute

This work heavily relies on access to a large variety of sample documents, to make sure we support all relevant cases. So if you encounter an airline boarding pass PDF file that isn’t detected as such with the current master branch or the upcoming 19.08 release, I’d be very interested in that test case :)

The new userbase wiki

Friday 12th of July 2019 09:35:35 AM

I’m happy to announce that the userbase wiki is getting a new theme and an updated MediaWiki version.

New theme - Aether

The old userbase theme was called Neverland and looked a bit antiquated. A new theme was created with a similar look to kde.org.

The new theme features light and dark modes using the new prefers-color-scheme: dark CSS media query. It is also mobile friendly.

I think this is quite an improvement over this:

I am confident that Claus_Chr and I found most of the visual glitches, but if you do find one, please report it to me on my talk page.

The new theme is hosted on KDE's GitLab instance. Contributions are welcome.

New MediaWiki version

We jumped MediaWiki from the obsolete version 1.26 to 1.31, the latest LTS version. This should fix some of the long-standing bugs and allow us to get all security updates with minimal maintenance needs.

What’s next?

A similar update for the community and techbase wikis should be coming soon™. The only things we still need to work on are an update of the configuration files and some testing to make sure nothing broke during the update. A preview version of the community wiki can already be tested at: wikisandbox.kde.org.

Contribute to Userbase

When you find a kool feature in KDE software, you can write a small tutorial or just a small paragraph about it, and the KDE UserBase Wiki is the right place to publish it. You don't need to know how to code, have perfect English or know how MediaWiki's formatting works to contribute. We also need translators.

Admire my GIMP skills ;)

Thanks to Blumen Herzenschein and Paul Brown for proofreading this blog post and to Ben Cooksley for pointing me in the right direction.

Discussion: Reddit or Mastodon

Guest post: Coloring book & wall art created with Krita

Friday 12th of July 2019 08:54:05 AM

On July 6th we launched Dream Ripple, an art studio located in Minneapolis, MN. We’d like to share a bit about who we are and how Krita aided us in creating our launch project – Wandering: a coloring book and wall art collection that features 50 hand-drawn illustrations of peculiar line-organisms.

We formed Dream Ripple out of a desire to create artwork with the hope to inspire curiosity in others. For a long time, Joe had been experimenting with an unusual abstract line style for doodles, fun drawings, and cards. After wandering through a craft store together, we got really inspired by how creative and fun the coloring books were and it motivated us to try and create one!

We found Krita online after looking for software focused on drawing, illustration, & painting. After a bit of experimenting, it quickly became apparent that Krita provided the toolset needed for our hand-drawn style. Our process was fairly straightforward: we started with pencil sketches, scanned them into Krita, and used a combination of the Stabilizer Brush and Bezier Curve Tool to create crisp uniform lines while still trying to retain the organic feel of the hand-drawn sketch. We’d then print out the illustrations, mark-up design adjustments with a red pen, and revise in Krita over and over until we were happy with it.

For the wall art color variations, we used the Fill Tool to color the areas between the lines. Since we used flat colors, we were able to add an additional 200 color variations to the 50 illustrations fairly quickly.

Also, being free and open source software, Krita allowed us to take time to work without the pressure of a subscription service. That accessibility is something we think is valuable to allow artists to take time to learn their craft without worry of a financial burden.

Here are links to our website, the specific project pages, and one of our wall art stores to see all 50 designs and the 200 color variations:

https://www.dreamripple.com/
https://www.dreamripple.com/wandering-coloring-book/
https://www.dreamripple.com/wandering-wall-art/
https://dream-ripple.pixels.com/art

You can follow us on:

Instagram: @dreamripple
Facebook: @dreamripple
Pinterest: @dreamripple
Twitter: @dreamripple_

Thanks for reading!

Kayla & Joe

Kdenlive 19.04.3 is out

Friday 12th of July 2019 04:00:26 AM

While the team is out for a much deserved summer break the last minor release post-refactoring is out with another huge amount of fixes. The highlights include fixing compositing and speed effect regressions, thumbnail display issues of clips in the timeline and many Windows fixes. With this release we finished polishing the rough edges and now we can focus on adding new features while fixing other small details left. As usual you can get the latest AppImage from our download page.

Speaking of that, the next major release is less than a month away and it already has some cool new features implemented, like changing the speed of a clip with Ctrl + resize and pressing Shift while hovering over a clip's thumbnail in the Project Bin to preview it. We've also bumped the Qt version to 5.12.4 and updated to the latest MLT. You can grab it from here to test it. Also planned is finishing the 3 point editing workflow and improvements to the speed effect. Stay tuned for more info soon.

Bugfixes:

  • Fix tools cursor when hovering a clip in timeline. Commit.
  • Ensure we don’t put a video stream in audio streams in mp3. Commit.
  • Fix loading .mlt playlist can corrupt project profile. Commit.
  • When opening a project file with missing proxy and clip, don’t remove clips from timeline. Commit.
  • Improve main item when grabbing. Commit.
  • Fix reloading of title clips and others. Commit. Fixes bug #409569
  • Update Appdata for 19.04.3 release. Commit.
  • Fix opening of project files with special character. Commit. Fixes bug #409545
  • Fix reloading playlist doesn’t update out. Commit.
  • Don’t leak Mlt repository on first run (attempt to fix Windows fail on first run). Commit.
  • Warn and try fixing clips that are in timeline but not in bin. Commit.
  • Fix timeline tracks config button only showing menu when clicking its arrow. Commit.
  • Fix lambda not called regression. Commit.
  • Don’t hardcode width of clip/composition resize handles. Commit.
  • Fix missing luma error on project opening with AppImage. Commit.
  • Fix reloading clip doesn’t update duration. Commit.
  • Fix overwrite/insert drop leaving audio on wrong track. Commit.
  • Fix error in mirror track calculation. Commit.
  • Fix overwrite clip with speed change. Commit.
  • Fix keyframe corruption on project opening (was creating unexpected keyframe at 0). Commit.
  • Fix keyframes corruption on dragging effect onto another clip. Commit.
  • Fix composition cannot be added after deletion / if another composition is placed just after current pos. Commit.
  • Fix fades broken on speed change. Commit. Fixes bug #409159
  • Fix speed job overwrites without warning. Commit.
  • Fix incorrect crash message on rendering finished. Commit.
  • Fix timeline preview when fps != 25. Commit.
  • Fix tests. Commit.
  • Effectstack: don’t display keyframes that are outside of clip. Commit.
  • Cleanup in clip/composition resize UI update. Commit.
  • Fix thread/cache count causing concurrency crashes. Commit.
  • Don’t trigger unnecessary refresh on clip resize. Commit.
  • Fix crash deleting last track. Commit.
  • Fix duplicate clip with speed change on comma locales. Commit.
  • Don’t allow undo/redo while dragging a clip in timeline. Commit.
  • Fix crash on cutting group with a composition. Commit.
  • Fix crash on group cut. Fixes #256. Commit.
  • Fix playlist duration in bin. Commit.
  • Fix crash loading playlist with different fps. Commit.
  • Fix thumbs not displayed in all thumbs view. Commit. See bug #408556
  • Ensure no empty space between thumbs on all thumbs view in timeline. Commit.
  • Some cleanup in audio thumbs. Fix recent regression and bug where audio thumbs were not displayed after extending a clip in timeline. Commit.
  • I18n fixes. Commit.
  • Use i18n for QML. Commit.
  • Fix monitor image hidden after style change. Commit.
  • Fix resize failure leaving clip at wrong size. Commit.
  • Fix XML translation for Generators. Commit.
  • Fix some effects default params on locales with comma. Commit.
  • Fix crash after undo composition deletion. Commit.
  • Fix i18n for QML. Commit.
  • Fix various selection regressions. Commit.
  • Don’t export metadata as url encoded strings. Commit. Fixes bug #408461
  • Fix crash on project close, see #236. Commit.
  • Fix zone rendering with updated MLT. Commit.
  • After undoing deletion, item should not show up as selected. Commit.
  • Fix disable clip broken regression. Commit.
  • Move zoom options to Timeline, remove Duplicate View. Commit.
  • Fix crash on item deletion. Fixes #235. Commit.
  • Fix fade out moving 1 frame right on mouse release. Commit.
  • Major speedup in clip selection that caused several seconds lag on large projects. Commit.
  • Fix changing composition track does not replug it. Commit.
  • Update appdata version(late again sorry). Commit.
  • Fix freeze when moving clip introduced in previous commit. Commit.
  • Fix typo that may prevent display of transcode menu. Commit.
  • Don’t check duration each time a clip is inserted on project load,. Commit.
  • Show progress when loading a document. Commit.
  • Make it possible to assign shortcut to multitrack view. Commit.
  • Allow resizing item start/end on clip in current track if no item is selected. Commit.
  • Fix profile change not applied if user doesn’t want to save current project. Commit. Fixes bug #408372
  • Fix crash on changing project’s fps. Commit. Fixes bug #408373
  • Add .kdenlive project files to the list of allowed clips in a project. Commit. Fixes bug #408299
  • Correctly save and restore rendering properties for the project. Commit.
  • Workaround MLT consumer scaling issue #453 by using multi consumer. Commit. See bug #407678
  • Fix groups keeping keyboard grab state on unselect,. Commit.
  • Fix the remaining compositing issues reported by Harald (mimick the 18.x behavior). Commit.
  • Don’t warn about missing timeline preview chunks on project opening. Commit.
  • Fix forced track composition should indicate state in timeline (yellow background + track name). Commit.
  • Save track compositing mode in project to restore it on load. Commit. Fixes bug #408081

`make -j5 kritaflake`

Thursday 11th of July 2019 02:57:41 AM

At the end of June I finished copy-on-write vector layers. From the very beginning, I have been researching possibilities to make kritaflake implicitly sharable. In that post I mentioned the approach Sean Parent uses for Photoshop, and adapted it for the derived d-pointers in Flake.

Derived d-pointers

TL;DR: We got rid of them.

As I mentioned in the task page, the derived d-pointers originally in Flake are a barrier to implicit sharing. One of the reasons is that we need to write more code (either a KisSharedDescendent wrapper class, or repeated code for virtual clone functions). Also, derived d-pointers do not actually encapsulate the data in the parent classes – for example, the members in KoShapePrivate are all accessible by descendents of KoShape, say, KoShapeContainer. That is probably not how encapsulation should work. So in the end we decided to get rid of derived d-pointers in Flake.

This leads to one problem, however, in the class KoShapeGroup. KoShapeGroup is a descendent of KoShapeContainer, which owns a KoShapeContainerModel that can be subclassed to control the behaviour when a child is added to or removed from the container. KoShapeGroup uses ShapeGroupContainerModel which performs additional operations specific to KoShapeGroup.

After I merged my branch into master, it was reported that the Flake tests failed under the address sanitizer (ASan). I took a look and discovered that there was a use-after-free in the class KoShapeGroup, namely the use of its d-pointer. The use is called by the destructor of KoShapeContainer, which calls KoShapeContainerModel::deleteOwnedShapes(), which removes individual shapes from the container, which then calls KoShapeGroup::invalidateSizeCache(). The original situation was:

  1. destructor of KoShapeGroup was called;
  2. members defined in KoShapeGroup got deleted (nothing, because everything is in the derived d-pointer which is defined in KoShape);
  3. destructor of KoShapeContainer was called, which calls d->model->deleteOwnedShapes();
  4. then that of KoShape, which deletes all the private members.

But after the derived d-pointers are converted to normal ones, the calling sequence upon destruction becomes:

  1. destructor of KoShapeGroup was called;
  2. members defined in KoShapeGroup got deleted (its own d-pointer);
  3. destructor of KoShapeContainer was called, which calls d->model->deleteOwnedShapes();
  4. d->model is a ShapeGroupContainerModel, which will call KoShapeGroup::invalidateSizeCache();
  5. that last function accesses the d-pointer of KoShapeGroup, USE AFTER FREE.

In order to solve this problem we have to manually call model()->deleteOwnedShapes() in the destructor of KoShapeGroup, at which time the d-pointer is still accessible.

q-pointers

TL;DR: We also got rid of them.

q-pointers are a method used in Qt to hide private methods from the header files, in order to improve binary compatibility. q-pointers are stored in the *Private classes (the ds), indicating the object that owns the private instance. But this, of course, conflicts with the principle of "sharing", because multiple objects can now own the same data. The q-pointers in Flake are rather confusing under such circumstances, since the private data cannot know which object is the caller.

To avoid this confusion, there are multiple ways:

  1. to move all the functions that use the q-pointer into the public classes;
  2. to pass the q-pointer explicitly every time such a function in a private class is called; or
  3. to add another layer of “shared data” in the d-pointer and keep the q-pointers in the unshared part.
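
To make the confusion concrete, here is a minimal sketch, with made-up classes rather than the actual Flake ones, of the plain d-pointer/q-pointer pattern. As soon as two public objects share one Private instance, the single q stored in it can only point at one of them:

```cpp
// Minimal illustration (invented classes, not Flake code) of why a q-pointer
// clashes with shared private data.
#include <memory>

class Shape;

class ShapePrivate {
public:
    explicit ShapePrivate(Shape *owner) : q(owner) {}
    Shape *q;       // the q-pointer: back-reference to "the" owning public object
    int zIndex = 0;
};

class Shape {
public:
    Shape() : d(std::make_shared<ShapePrivate>(this)) {}

    // A copy that shares the private data, which is what implicit sharing wants.
    // After this, d->q still points at the object it was created with, so any
    // private-side code calling back through q may talk to the wrong Shape.
    Shape(const Shape &other) : d(other.d) {}

private:
    std::shared_ptr<ShapePrivate> d;
};

int main() {
    Shape a;
    Shape b = a;    // a and b now share one ShapePrivate; its q knows only the first owner
}
```

Option 3 from the list keeps exactly this kind of back-pointer out of the shared part, while options 1 and 2 remove the need to store it at all.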

Implicit sharing

To enable implicit sharing for the KoShape hierarchy, the only thing left to do is to change the QScopedPointer<Private> d; in the header file to QSharedDataPointer<Private> d; and make the private classes inherit QSharedData. This step is rather easy; afterwards, just run the tests to make sure nothing breaks. Hooray!
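
For readers who have not used the class before, here is a small self-contained sketch of what QSharedDataPointer gives you. The Shape class below is invented for the example and is not the real KoShape:

```cpp
// Standalone sketch of the QSharedDataPointer copy-on-write pattern described
// above; class and member names are made up, not taken from Flake.
#include <QDebug>
#include <QSharedData>
#include <QString>

class ShapePrivate : public QSharedData {
public:
    QString name;
    int zIndex = 0;
};

class Shape {
public:
    Shape() : d(new ShapePrivate) {}
    // The default copy constructor and assignment just bump a refcount: cheap copies.

    QString name() const { return d->name; }           // const access: no detach
    void setName(const QString &n) { d->name = n; }    // non-const access: detaches (copy-on-write)

private:
    QSharedDataPointer<ShapePrivate> d;
};

int main() {
    Shape a;
    a.setName(QStringLiteral("original"));
    Shape b = a;                        // shares a's private data
    b.setName(QStringLiteral("copy"));  // detach happens here; 'a' is untouched
    qDebug() << a.name() << b.name();   // "original" "copy"
}
```

With that in place, copying a shape is just a reference-count bump, and the first mutating call after a copy is what pays for the clone.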

It is coming alive

Thursday 11th of July 2019 12:55:42 AM

After digging for around a month and a half, I can finally make some selections with the Magnetic Lasso tool, which, I would say, I wrote with utter laziness.


New unit tests for the new code

Wednesday 10th of July 2019 08:43:53 PM

Hello everyone,

today I want to present the test system for Cantor's worksheet.
The worksheet is the most central, prominent and important part of the application, the place where most of the work is done.

So, it is important to cover this part with enough tests to ensure the quality and stability of this component in the future.

At the moment, the system contains only ten tests, all of which cover the recently added import of Jupyter notebooks (which I mentioned in my first post).
However, this test infrastructure is generic and can easily be used for testing Cantor's native files, too.

The test system checks that a worksheet/notebook file is loaded successfully, verifies the backend type, and validates the overall worksheet structure and the content of its entries.

Some content, such as images, is intentionally not validated: doing so would increase the complexity of the tests and slow down their execution without adding much value for quality assurance.
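
To give a rough idea of what such a test can look like, here is an illustrative QtTest sketch. It is not Cantor's actual test code: the real tests go through Cantor's worksheet classes, while this example inspects the notebook JSON directly and uses a made-up data file, just to stay self-contained.

```cpp
// Illustrative only: a QtTest-style check of a Jupyter notebook, mirroring the
// kinds of assertions described above (file loads, backend type, structure,
// entry content). Not Cantor's real test code; the data file path is invented.
#include <QtTest>
#include <QFile>
#include <QJsonArray>
#include <QJsonDocument>
#include <QJsonObject>

class TestJupyterImport : public QObject
{
    Q_OBJECT
private slots:
    void importsSimpleNotebook()
    {
        QFile file(QStringLiteral("data/simple.ipynb"));
        QVERIFY(file.open(QIODevice::ReadOnly));                       // the file loads successfully

        const QJsonDocument doc = QJsonDocument::fromJson(file.readAll());
        QVERIFY(doc.isObject());                                       // it is a valid notebook document

        const QJsonObject root = doc.object();
        const QString kernel = root.value(QStringLiteral("metadata")).toObject()
                                   .value(QStringLiteral("kernelspec")).toObject()
                                   .value(QStringLiteral("name")).toString();
        QCOMPARE(kernel, QStringLiteral("python3"));                   // backend type

        const QJsonArray cells = root.value(QStringLiteral("cells")).toArray();
        QCOMPARE(cells.size(), 3);                                     // overall structure
        QCOMPARE(cells.at(0).toObject().value(QStringLiteral("cell_type")).toString(),
                 QStringLiteral("markdown"));                          // type of the first entry
        // Image outputs are deliberately left unchecked, as discussed above.
    }
};

QTEST_MAIN(TestJupyterImport)
#include "testjupyterimport.moc"   // assumes this file is named testjupyterimport.cpp
```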

This new infrastructure has already proven helpful. While writing the first tests for the worksheet I found a couple of bugs in the implementation of the Jupyter notebook import. Now that they are fixed and these additional safeguards are in place, I'm more confident about the implementation and can say with more certainty that the import of Jupyter notebooks works fine.

In a previous post I mentioned some issues with the performance of the renderer used for mathematical expressions in Cantor. It turned out this problem is not as easy to solve as I first assumed. But now, after having finished a substantial part of the work planned for this GSoC project, I can give more attention to the remaining problems, including the performance of the renderer.
In the next post I plan to show a better implementation of the math renderer in Cantor.

KMyMoney 5.0.5 released

Wednesday 10th of July 2019 01:32:32 PM

The KMyMoney development team today announces the immediate availability of version 5.0.5 of its open source Personal Finance Manager.

After three months it is now ready: KMyMoney 5.0.5 comes with some important bugfixes. As usual, problems have been reported by our users and the development team worked hard to fix them in the meantime. The result of this effort is the brand new KMyMoney 5.0.5 release.

Despite even more testing we understand that some bugs may have slipped past our best efforts. If you find one of them, please forgive us, and be sure to report it, either to the mailing list or on bugs.kde.org.

From here, we will continue to fix reported bugs and work to add many requested additions and enhancements, as well as further improve performance.

Please feel free to visit our overview page of the CI builds at https://kmymoney.org/build.php and maybe try out the latest and greatest by using a daily crafted AppImage version built from the stable branch.

The details

Here is the list of the bugs which have been fixed. A list of all changes between v5.0.4 and v5.0.5 can be found in the ChangeLog.

  • 368159 Report Transactions by Payee omits transactions lacking category
  • 390681 OFX import and unrecognized <FITID> tag
  • 392305 Not all Asset accounts are shown during OFX import
  • 396225 When importing a ofx/qif file, it does not show me all my accounts
  • 396978 Stable xml file output
  • 400761 Cannot open files on MacOS
  • 401397 kmymoney changes group permissions
  • 403745 in import dialog, newly-created account doesn’t appear in pulldown menu
  • 403825 Transaction validity filter is reset when re-opening configuration
  • 403826 Transactions without category assignment are not shown in report
  • 403885 Buying / selling investments interest / fees round to 2 decimal places even when currency is to 6 decimal places
  • 403886 No way to set/change investment start date in investment wizard
  • 403955 After an action, the cursor returns to top of page and does not remain in a similar position to when action was started
  • 404156 Can’t select many columns as memo
  • 404848 Crash on “Enter Next Transaction”
  • 405061 No chart printing support
  • 405329 CPU loop reconciling if all transactions are cleared
  • 405817 CSV importer trailing lines are treated as absolute lines
  • 405828 Budget problems
  • 405928 Loss of inserted data in transaction planner
  • 406073 Change of forecast method is not reflected in forecast view
  • 406074 Unused setting “Forecast (history)” for home view
  • 406220 Crash when deleting more than 5000 transactions at once
  • 406509 “Find Transaction…” dialog focus is on “Help” button instead of “Find”
  • 406525 Subtotals are not correctly aggregated when (sub-)categories have the same name
  • 406537 Encrypted file cannot be saved as unencrypted
  • 406608 Custom report based on Annual Budget incorrectly getting Actuals
  • 406714 Home view shows budget header twice

Here is the list of the enhancements which have been added:

  • 341589 Cannot assign tag to a split

Beware of some of the Qt 5.13 deprecation porting hints

Tuesday 9th of July 2019 11:18:17 PM

QComboBox::currentIndexChanged(QString) used to have (i.e. in Qt 5.13.0) a deprecation warning that said "Use currentTextChanged() instead".

That has recently been reverted, since the two are not totally equivalent. Sure, you can probably "port" from one to the other, but the "use" wording reads to me like "this is the same", and they are not.
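
As one example of the difference (this is my own sketch, based on how I understand the documented behaviour, not something taken from Qt's changelog): renaming the currently selected item changes the current text without changing the current index, so only one of the two signals fires.

```cpp
// Small demonstration of one way QComboBox::currentIndexChanged(QString) and
// QComboBox::currentTextChanged(QString) can disagree. Built against Qt 5,
// where the QString overload of currentIndexChanged still exists.
#include <QApplication>
#include <QComboBox>
#include <QDebug>

int main(int argc, char **argv)
{
    QApplication app(argc, argv);

    QComboBox combo;
    combo.addItems({QStringLiteral("one"), QStringLiteral("two")});

    QObject::connect(&combo,
                     QOverload<const QString &>::of(&QComboBox::currentIndexChanged),
                     [](const QString &text) { qDebug() << "currentIndexChanged:" << text; });
    QObject::connect(&combo, &QComboBox::currentTextChanged,
                     [](const QString &text) { qDebug() << "currentTextChanged:" << text; });

    // Rename the item that is currently selected: the current text changes but
    // the current index does not, so only currentTextChanged should be emitted
    // here, and a blind "port" between the two signals would change behaviour.
    combo.setItemText(combo.currentIndex(), QStringLiteral("ONE"));

    return 0;
}
```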

Another one of those is QPainter::initFrom, which initializes a painter's pen, background and font from a given widget. It is deprecated, probably rightly so ("what is the pen of a widget?"), but the deprecation warning says "Use begin(QPaintDevice*)", and again, if you look at the implementation, the two do not really do the same thing. I still need to find time to complain to the Qt developers and get it fixed.

Anyhow, as usual, when porting make sure you do a correct port and not just blind changes.

More in Tux Machines

LWN: Spectre, Linux and Debian Development

  • Grand Schemozzle: Spectre continues to haunt

    The Spectre v1 hardware vulnerability is often characterized as allowing array bounds checks to be bypassed via speculative execution. While that is true, it is not the full extent of the shenanigans allowed by this particular class of vulnerabilities. For a demonstration of that fact, one need look no further than the "SWAPGS vulnerability" known as CVE-2019-1125 to the wider world or as "Grand Schemozzle" to the select group of developers who addressed it in the Linux kernel. Segments are mostly an architectural relic from the earliest days of x86; to a great extent, they did not survive into the 64-bit era. That said, a few segments still exist for specific tasks; these include FS and GS. The most common use for GS in current Linux systems is for thread-local or CPU-local storage; in the kernel, the GS segment points into the per-CPU data area. User space is allowed to make its own use of GS; the arch_prctl() system call can be used to change its value. As one might expect, the kernel needs to take care to use its own GS pointer rather than something that user space came up with. The x86 architecture obligingly provides an instruction, SWAPGS, to make that relatively easy. On entry into the kernel, a SWAPGS instruction will exchange the current GS segment pointer with a known value (which is kept in a model-specific register); executing SWAPGS again before returning to user space will restore the user-space value. Some carefully placed SWAPGS instructions will thus prevent the kernel from ever running with anything other than its own GS pointer. Or so one would think.

  • Long-term get_user_pages() and truncate(): solved at last?

    Technologies like RDMA benefit from the ability to map file-backed pages into memory. This benefit extends to persistent-memory devices, where the backing store for the file can be mapped directly without the need to go through the kernel's page cache. There is a fundamental conflict, though, between mapping a file's backing store directly and letting the filesystem code modify that file's on-disk layout, especially when the mapping is held in place for a long time (as RDMA is wont to do). The problem seems intractable, but there may yet be a solution in the form of this patch set (marked "V1,000,002") from Ira Weiny. The problems raised by the intersection of mapping a file (via get_user_pages()), persistent memory, and layout changes by the filesystem were the topic of a contentious session at the 2019 Linux Storage, Filesystem, and Memory-Management Summit. The core question can be reduced to this: what should happen if one process calls truncate() while another has an active get_user_pages() mapping that pins some or all of that file's pages? If the filesystem actually truncates the file while leaving the pages mapped, data corruption will certainly ensue. The options discussed in the session were to either fail the truncate() call or to revoke the mapping, causing the process that mapped the pages to receive a SIGBUS signal if it tries to access them afterward. There were passionate proponents for both options, and no conclusion was reached. Weiny's new patch set resolves the question by causing an operation like truncate() to fail if long-term mappings exist on the file in question. But it also requires user space to jump through some hoops before such mappings can be created in the first place. This approach comes from the conclusion that, in the real world, there is no rational use case where somebody might want to truncate a file that has been pinned into place for use with RDMA, so there is no reason to make that operation work. There is ample reason, though, for preventing filesystem corruption and for informing an application that gets into such a situation that it has done something wrong.

  • Hardening the "file" utility for Debian

    In addition, he had already encountered problems with file running in environments with non-standard libraries that were loaded using the LD_PRELOAD environment variable. Those libraries can (and do) make system calls that the regular file binary does not make; the system calls were disallowed by the seccomp() filter. Building a Debian package often uses FakeRoot (or fakeroot) to run commands in a way that appears that they have root privileges for filesystem operations—without actually granting any extra privileges. That is done so that tarballs and the like can be created containing files with owners other than the user ID running the Debian packaging tools, for example. Fakeroot maintains a mapping of the "changes" made to owners, groups, and permissions for files so that it can report those to other tools that access them. It does so by interposing a library ahead of the GNU C library (glibc) to intercept file operations. In order to do its job, fakeroot spawns a daemon (faked) that is used to maintain the state of the changes that programs make inside of the fakeroot. The libfakeroot library that is loaded with LD_PRELOAD will then communicate to the daemon via either System V (sysv) interprocess communication (IPC) calls or by using TCP/IP. Biedl referred to a bug report in his message, where Helmut Grohne had reported a problem with running file inside a fakeroot.

Flameshot is a brilliant screenshot tool for Linux

The default screenshot tool in Ubuntu is alright for basic snips, but if you want a really good one you need to install a third-party screenshot app. Shutter is probably my favorite, but I decided to give Flameshot a try. Packages are available for various distributions including Ubuntu, Arch, openSUSE and Debian. You can find installation instructions on the official project website. Read more

Android Leftovers

IBM/Red Hat and Intel Leftovers

  • Troubleshooting Red Hat OpenShift applications with throwaway containers

    Imagine this scenario: Your cool microservice works fine from your local machine but fails when deployed into your Red Hat OpenShift cluster. You cannot see anything wrong with the code or anything wrong in your services, configuration maps, secrets, and other resources. But, you know something is not right. How do you look at things from the same perspective as your containerized application? How do you compare the runtime environment from your local application with the one from your container? If you performed your due diligence, you wrote unit tests. There are no hard-coded configurations or hidden assumptions about the runtime environment. The cause should be related to the configuration your application receives inside OpenShift. Is it time to run your app under a step-by-step debugger or add tons of logging statements to your code? We’ll show how two features of the OpenShift command-line client can help: the oc run and oc debug commands.

  • What piece of advice had the greatest impact on your career?

    I love learning the what, why, and how of new open source projects, especially when they gain popularity in the DevOps space. Classification as a "DevOps technology" tends to mean scalable, collaborative systems that go across a broad range of challenges—from message bus to monitoring and back again. There is always something new to explore, install, spin up, and explore.

  • How DevOps is like auto racing

    When I talk about desired outcomes or answer a question about where to get started with any part of a DevOps initiative, I like to mention NASCAR or Formula 1 racing. Crew chiefs for these race teams have a goal: finish in the best place possible with the resources available while overcoming the adversity thrown at you. If the team feels capable, the goal gets moved up a series of levels to holding a trophy at the end of the race. To achieve their goals, race teams don’t think from start to finish; they flip the table to look at the race from the end goal to the beginning. They set a goal, a stretch goal, and then work backward from that goal to determine how to get there. Work is delegated to team members to push toward the objectives that will get the team to the desired outcome. [...] Race teams practice pit stops all week before the race. They do weight training and cardio programs to stay physically ready for the grueling conditions of race day. They are continually collaborating to address any issue that comes up. Software teams should also practice software releases often. If safety systems are in place and practice runs have been going well, they can release to production more frequently. Speed makes things safer in this mindset. It’s not about doing the “right” thing; it’s about addressing as many blockers to the desired outcome (goal) as possible and then collaborating and adjusting based on the real-time feedback that’s observed. Expecting anomalies and working to improve quality and minimize the impact of those anomalies is the expectation of everyone in a DevOps world.

  • Deep Learning Reference Stack v4.0 Now Available

    Artificial Intelligence (AI) continues to represent one of the biggest transformations underway, promising to impact everything from the devices we use to cloud technologies, and reshape infrastructure, even entire industries. Intel is committed to advancing the Deep Learning (DL) workloads that power AI by accelerating enterprise and ecosystem development. From our extensive work developing AI solutions, Intel understands how complex it is to create and deploy applications for deep learning workloads. That's why we developed an integrated Deep Learning Reference Stack, optimized for Intel Xeon Scalable processor and released the companion Data Analytics Reference Stack. Today, we're proud to announce the next Deep Learning Reference Stack release, incorporating customer feedback and delivering an enhanced user experience with support for expanded use cases.

  • Clear Linux Releases Deep Learning Reference Stack 4.0 For Better AI Performance

    Intel's Clear Linux team on Wednesday announced their Deep Learning Reference Stack 4.0 during the Linux Foundation's Open-Source Summit North America event taking place in San Diego. Clear Linux's Deep Learning Reference Stack continues to be engineered for showing off the most features and maximum performance for those interested in AI / deep learning and running on Intel Xeon Scalable CPUs. This optimized stack allows developers to more easily get going with a tuned deep learning stack that should already be offering near optimal performance.